Running as unit: rb-build-arm64_11-129892.service
====================================================================================
Fri Oct 31 21:48:20 UTC 2025 - running /srv/jenkins/bin/reproducible_build.sh (for job reproducible_builder_arm64_11) on jenkins, called using "codethink04-arm64 codethink03-arm64" as arguments.
Fri Oct 31 21:48:20 UTC 2025 - actually running "reproducible_build.sh" (md5sum bcb6fe1b50cf4e8eedacd0971a9eb63f) as "/tmp/jenkins-script-6jTjdR93"
$ git clone https://salsa.debian.org/qa/jenkins.debian.net.git ; more CONTRIBUTING
Fri Oct 31 21:48:20 UTC 2025 - checking /var/lib/jenkins/offline_nodes if codethink04-arm64.debian.net is marked as down.
Fri Oct 31 21:48:20 UTC 2025 - checking via ssh if codethink04-arm64.debian.net is up.
removed '/tmp/read-only-fs-test-mkzOzr'
Fri Oct 31 21:48:21 UTC 2025 - checking /var/lib/jenkins/offline_nodes if codethink03-arm64.debian.net is marked as down.
Fri Oct 31 21:48:21 UTC 2025 - checking via ssh if codethink03-arm64.debian.net is up.
removed '/tmp/read-only-fs-test-bwhSdc'
ok, let's check if sparql-wrapper-python is building anywhere yet…
ok, sparql-wrapper-python is not building anywhere…
UPDATE 1
=============================================================================
Initialising reproducible build of sparql-wrapper-python in forky on arm64 on jenkins now.
1st build will be done on codethink04-arm64.debian.net.
2nd build will be done on codethink03-arm64.debian.net.
=============================================================================
Fri Oct 31 21:48:23 UTC 2025 I: starting to build sparql-wrapper-python/forky/arm64 on jenkins on '2025-10-31 21:48'
Fri Oct 31 21:48:23 UTC 2025 I: The jenkins build log is/was available at https://jenkins.debian.net/userContent/reproducible/debian/build_service/arm64_11/129892/console.log
1761947303 arm64 forky sparql-wrapper-python
Fri Oct 31 21:48:23 UTC 2025 I: Downloading source for forky/sparql-wrapper-python=2.0.0-2
--2025-10-31 21:48:23--  http://deb.debian.org/debian/pool/main/s/sparql-wrapper-python/sparql-wrapper-python_2.0.0-2.dsc
Connecting to 46.16.76.132:3128... connected.
Proxy request sent, awaiting response... 200 OK
Length: 2214 (2.2K) [text/prs.lines.tag]
Saving to: ‘sparql-wrapper-python_2.0.0-2.dsc’

     0K ..                                                    100%  263M=0s

2025-10-31 21:48:23 (263 MB/s) - ‘sparql-wrapper-python_2.0.0-2.dsc’ saved [2214/2214]

Fri Oct 31 21:48:23 UTC 2025 I: sparql-wrapper-python_2.0.0-2.dsc
-----BEGIN PGP SIGNED MESSAGE-----
Hash: SHA512

Format: 3.0 (quilt)
Source: sparql-wrapper-python
Binary: python3-sparqlwrapper
Architecture: all
Version: 2.0.0-2
Maintainer: Debian Python Team
Uploaders: Christian M. Amsüss ,
Homepage: https://rdflib.github.io/sparqlwrapper/
Standards-Version: 4.6.0
Vcs-Browser: https://salsa.debian.org/python-team/packages/sparql-wrapper-python
Vcs-Git: https://salsa.debian.org/python-team/packages/sparql-wrapper-python.git
Testsuite: autopkgtest-pkg-python
Build-Depends: debhelper-compat (= 13), dh-sequence-python3, python3-all, python3-pytest, python3-setuptools, python3-rdflib
Package-List:
 python3-sparqlwrapper deb python optional arch=all
Checksums-Sha1:
 feae58b1ce9b0648039d5a9895818ae2525a3688 131812 sparql-wrapper-python_2.0.0.orig.tar.gz
 26567359f40b8c07673e57817ebca78a89bcc552 5692 sparql-wrapper-python_2.0.0-2.debian.tar.xz
Checksums-Sha256:
 9f2baa0f5cdfdc80ec8f5ad9e4379f15fcd2d0fd7e646c06d6cb7058daa12da1 131812 sparql-wrapper-python_2.0.0.orig.tar.gz
 dee5ee69d76fe6e7c88e27da9f3c3be25548ba375f49d521167e7a5cc3768eb0 5692 sparql-wrapper-python_2.0.0-2.debian.tar.xz
Files:
 baa200c8f3e8d65ee7c8ba9a7b016dbc 131812 sparql-wrapper-python_2.0.0.orig.tar.gz
 6cfaedc25e78869bd3ca332720acc133 5692 sparql-wrapper-python_2.0.0-2.debian.tar.xz

-----BEGIN PGP SIGNATURE-----

iQJFBAEBCgAvFiEEj23hBDd/OxHnQXSHMfMURUShdBoFAmZ7yfARHHRjaGV0QGRl
Ymlhbi5vcmcACgkQMfMURUShdBpYSxAArFloUjy79P4VJMYgZ9mCXFkmum3wxvfd
q1aLDS7o22WD1Z4sN30/JOV5eDLnIX7u0QaAZxK8Sm2Mzj+TveWFFsyCb2fQxSrX
ABHDKS7BUE8Lu2N7YR6ZD/PJwOlnZ9f6HS4ktIJ2H1N6ZX+KXyUrQ5VbV49N9UYW
W1xh1VuJwI8tZyIi1IBlaqos0O70i9vOUURQdXekIapRo9qYgjIsElavuS8YBPTa
Bf6Se9U7T3ra3+r8cnL2qSaM0Zf4iSMLpkUIZAvgz2i4hMNnVsR6pnqK5IA9VaAP
CQjcOcVAI9DMR1jTg/nMWLp3IG+6ACnHWIL3GkZUJZEH4SCD0pBSsJWPIk+jkQrV
Eo72Qer3IdpwotKEb2QG9DFIjrPq7LgzE/VyCWOGNQPkLdJzOLvpNgkTHjWwRvlP
+BTtCDsRYWQXTBLP4l2wH8FeGUpme31XLJZQT4hHV3OvZ44VkgXIgZZ+x6dJMmxe
QYIE6OKPxVgOfJYkv3XFNSfu3UhAEIX3NGFeW77SX0tYRVx4Jll/AJKYU42zLvvv
ph1iIVm0z/QuSpMt6rLsIJdiOhENNrwwwZ5F5knZXD8eP8w/9JVroWF1O4c5VvjI
pdTZU31clAV2z7YtzTe8MMGE+ylINefkCn1q/BEKLGta08b/c22pmayI3Og1n698
vSybjMqJuHs=
=Wspx
-----END PGP SIGNATURE-----
Fri Oct 31 21:48:23 UTC 2025 I: Checking whether the package is not for us
Fri Oct 31 21:48:23 UTC 2025 I: Starting 1st build on remote node codethink04-arm64.debian.net.
Fri Oct 31 21:48:23 UTC 2025 I: Preparing to do remote build '1' on codethink04-arm64.debian.net.
Fri Oct 31 21:48:23 UTC 2025 - checking /var/lib/jenkins/offline_nodes if codethink04-arm64.debian.net is marked as down.
Fri Oct 31 21:48:23 UTC 2025 - checking via ssh if codethink04-arm64.debian.net is up.
removed '/tmp/read-only-fs-test-mmcqLK'
====================================================================================
Fri Oct 31 21:48:24 UTC 2025 - running /srv/jenkins/bin/reproducible_build.sh (for job /srv/jenkins/bin/reproducible_build.sh) on codethink04-arm64, called using "1 sparql-wrapper-python forky /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy 2.0.0-2" as arguments.
Fri Oct 31 21:48:24 UTC 2025 - actually running "reproducible_build.sh" (md5sum bcb6fe1b50cf4e8eedacd0971a9eb63f) as "/tmp/jenkins-script-mEuEB89r"
$ git clone https://salsa.debian.org/qa/jenkins.debian.net.git ; more CONTRIBUTING
Fri Oct 31 21:48:24 UTC 2025 I: Downloading source for forky/sparql-wrapper-python=2.0.0-2
Reading package lists...
NOTICE: 'sparql-wrapper-python' packaging is maintained in the 'Git' version control system at:
https://salsa.debian.org/python-team/packages/sparql-wrapper-python.git
Please use:
git clone https://salsa.debian.org/python-team/packages/sparql-wrapper-python.git
to retrieve the latest (possibly unreleased) updates to the package.
Need to get 140 kB of source archives.
Get:1 http://deb.debian.org/debian forky/main sparql-wrapper-python 2.0.0-2 (dsc) [2214 B]
Get:2 http://deb.debian.org/debian forky/main sparql-wrapper-python 2.0.0-2 (tar) [132 kB]
Get:3 http://deb.debian.org/debian forky/main sparql-wrapper-python 2.0.0-2 (diff) [5692 B]
Fetched 140 kB in 0s (2016 kB/s)
Download complete and in download only mode
=============================================================================
Building sparql-wrapper-python in forky on arm64 on codethink04-arm64 now.
Date: Fri Oct 31 21:48:24 GMT 2025
Date UTC: Fri Oct 31 21:48:24 UTC 2025
=============================================================================
W: /root/.pbuilderrc does not exist
I: Logging to b1/build.log
I: pbuilder: network access will be disabled during build
I: Current time: Fri Oct 31 09:48:24 -12 2025
I: pbuilder-time-stamp: 1761947304
I: Building the build Environment
I: extracting base tarball [/var/cache/pbuilder/forky-reproducible-base.tgz]
I: copying local configuration
W: --override-config is not set; not updating apt.conf Read the manpage for details.
I: mounting /proc filesystem
I: mounting /sys filesystem
I: creating /{dev,run}/shm
I: mounting /dev/pts filesystem
I: redirecting /dev/ptmx to /dev/pts/ptmx
I: policy-rc.d already exists
I: Copying source file
I: copying [sparql-wrapper-python_2.0.0-2.dsc]
I: copying [./sparql-wrapper-python_2.0.0.orig.tar.gz]
I: copying [./sparql-wrapper-python_2.0.0-2.debian.tar.xz]
I: Extracting source
dpkg-source: warning: cannot verify inline signature for ./sparql-wrapper-python_2.0.0-2.dsc: no acceptable signature found
dpkg-source: info: extracting sparql-wrapper-python in sparql-wrapper-python-2.0.0
dpkg-source: info: unpacking sparql-wrapper-python_2.0.0.orig.tar.gz
dpkg-source: info: unpacking sparql-wrapper-python_2.0.0-2.debian.tar.xz
I: Not using root during the build.
I: Installing the build-deps
I: user script /srv/workspace/pbuilder/4132079/tmp/hooks/D02_print_environment starting
I: set
  BUILDDIR='/build/reproducible-path'
  BUILDUSERGECOS='first user,first room,first work-phone,first home-phone,first other'
  BUILDUSERNAME='pbuilder1'
  BUILD_ARCH='arm64'
  DEBIAN_FRONTEND='noninteractive'
  DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=12 '
  DISTRIBUTION='forky'
  HOME='/root'
  HOST_ARCH='arm64'
  IFS=' '
  INVOCATION_ID='7c7dec4a095148debb57cd30c0788beb'
  LANG='C'
  LANGUAGE='en_US:en'
  LC_ALL='C'
  MAIL='/var/mail/root'
  OPTIND='1'
  PATH='/usr/sbin:/usr/bin:/sbin:/bin:/usr/games'
  PBCURRENTCOMMANDLINEOPERATION='build'
  PBUILDER_OPERATION='build'
  PBUILDER_PKGDATADIR='/usr/share/pbuilder'
  PBUILDER_PKGLIBDIR='/usr/lib/pbuilder'
  PBUILDER_SYSCONFDIR='/etc'
  PPID='4132079'
  PS1='# '
  PS2='> '
  PS4='+ '
  PWD='/'
  SHELL='/bin/bash'
  SHLVL='2'
  SUDO_COMMAND='/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/pbuilderrc_fN2b --distribution forky --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/forky-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b1 --logfile b1/build.log sparql-wrapper-python_2.0.0-2.dsc'
  SUDO_GID='109'
  SUDO_HOME='/var/lib/jenkins'
  SUDO_UID='104'
  SUDO_USER='jenkins'
  TERM='unknown'
  TZ='/usr/share/zoneinfo/Etc/GMT+12'
  USER='root'
  _='/usr/bin/systemd-run'
  http_proxy='http://192.168.101.4:3128'
I: uname -a
Linux codethink04-arm64 6.12.48+deb13-cloud-arm64 #1 SMP Debian 6.12.48-1 (2025-09-20) aarch64 GNU/Linux
I: ls -l /bin
lrwxrwxrwx 1 root root 7 Aug 10 12:30 /bin -> usr/bin
I: user script /srv/workspace/pbuilder/4132079/tmp/hooks/D02_print_environment finished
 -> Attempting to satisfy build-dependencies
 -> Creating pbuilder-satisfydepends-dummy package
Package: pbuilder-satisfydepends-dummy
Version: 0.invalid.0
Architecture: arm64
Maintainer: Debian Pbuilder Team
Description: Dummy package to satisfy dependencies with aptitude - created by pbuilder
 This package was created automatically by pbuilder to satisfy the
 build-dependencies of the package being currently built.
Depends: debhelper-compat (= 13), dh-sequence-python3, python3-all, python3-pytest, python3-setuptools, python3-rdflib
dpkg-deb: building package 'pbuilder-satisfydepends-dummy' in '/tmp/satisfydepends-aptitude/pbuilder-satisfydepends-dummy.deb'.
Selecting previously unselected package pbuilder-satisfydepends-dummy.
(Reading database ... 19971 files and directories currently installed.)
Preparing to unpack .../pbuilder-satisfydepends-dummy.deb ...
Unpacking pbuilder-satisfydepends-dummy (0.invalid.0) ...
dpkg: pbuilder-satisfydepends-dummy: dependency problems, but configuring anyway as you requested:
 pbuilder-satisfydepends-dummy depends on debhelper-compat (= 13); however:
  Package debhelper-compat is not installed.
 pbuilder-satisfydepends-dummy depends on dh-sequence-python3; however:
  Package dh-sequence-python3 is not installed.
 pbuilder-satisfydepends-dummy depends on python3-all; however:
  Package python3-all is not installed.
 pbuilder-satisfydepends-dummy depends on python3-pytest; however:
  Package python3-pytest is not installed.
 pbuilder-satisfydepends-dummy depends on python3-setuptools; however:
  Package python3-setuptools is not installed.
 pbuilder-satisfydepends-dummy depends on python3-rdflib; however:
  Package python3-rdflib is not installed.
Setting up pbuilder-satisfydepends-dummy (0.invalid.0) ...
Reading package lists...
Building dependency tree...
Reading state information...
Initializing package states...
Writing extended state information...
Building tag database...
pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0)
pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0)
The following NEW packages will be installed:
  autoconf{a} automake{a} autopoint{a} autotools-dev{a} bsdextrautils{a} debhelper{a} dh-autoreconf{a} dh-python{a} dh-strip-nondeterminism{a} dwz{a} file{a} gettext{a} gettext-base{a} groff-base{a} intltool-debian{a} libarchive-zip-perl{a} libdebhelper-perl{a} libelf1t64{a} libexpat1{a} libffi8{a} libfile-stripnondeterminism-perl{a} libmagic-mgc{a} libmagic1t64{a} libpipeline1{a} libpython3-stdlib{a} libpython3.13-minimal{a} libpython3.13-stdlib{a} libreadline8t64{a} libtool{a} libuchardet0{a} libunistring5{a} libxml2-16{a} m4{a} man-db{a} media-types{a} netbase{a} po-debconf{a} python3{a} python3-all{a} python3-autocommand{a} python3-inflect{a} python3-iniconfig{a} python3-jaraco.context{a} python3-jaraco.functools{a} python3-jaraco.text{a} python3-minimal{a} python3-more-itertools{a} python3-packaging{a} python3-pkg-resources{a} python3-pluggy{a} python3-pygments{a} python3-pyparsing{a} python3-pytest{a} python3-rdflib{a} python3-setuptools{a} python3-typeguard{a} python3-typing-extensions{a} python3-zipp{a} python3.13{a} python3.13-minimal{a} readline-common{a} sensible-utils{a} tzdata{a}
The following packages are RECOMMENDED but will NOT be installed:
  ca-certificates curl
  libarchive-cpio-perl libltdl-dev libmail-sendmail-perl lynx python3-html5rdf python3-lxml python3-networkx python3-orjson wget
0 packages upgraded, 63 newly installed, 0 to remove and 0 not upgraded.
Need to get 20.5 MB of archives. After unpacking 88.4 MB will be used.
Writing extended state information...
Get: 1 http://deb.debian.org/debian forky/main arm64 libexpat1 arm64 2.7.3-1 [96.5 kB]
Get: 2 http://deb.debian.org/debian forky/main arm64 libpython3.13-minimal arm64 3.13.9-1 [858 kB]
Get: 3 http://deb.debian.org/debian forky/main arm64 python3.13-minimal arm64 3.13.9-1 [2061 kB]
Get: 4 http://deb.debian.org/debian forky/main arm64 python3-minimal arm64 3.13.7-1 [27.2 kB]
Get: 5 http://deb.debian.org/debian forky/main arm64 media-types all 14.0.0 [30.8 kB]
Get: 6 http://deb.debian.org/debian forky/main arm64 netbase all 6.5 [12.4 kB]
Get: 7 http://deb.debian.org/debian forky/main arm64 tzdata all 2025b-5 [260 kB]
Get: 8 http://deb.debian.org/debian forky/main arm64 libffi8 arm64 3.5.2-2 [21.5 kB]
Get: 9 http://deb.debian.org/debian forky/main arm64 readline-common all 8.3-3 [74.8 kB]
Get: 10 http://deb.debian.org/debian forky/main arm64 libreadline8t64 arm64 8.3-3 [169 kB]
Get: 11 http://deb.debian.org/debian forky/main arm64 libpython3.13-stdlib arm64 3.13.9-1 [1900 kB]
Get: 12 http://deb.debian.org/debian forky/main arm64 python3.13 arm64 3.13.9-1 [764 kB]
Get: 13 http://deb.debian.org/debian forky/main arm64 libpython3-stdlib arm64 3.13.7-1 [10.2 kB]
Get: 14 http://deb.debian.org/debian forky/main arm64 python3 arm64 3.13.7-1 [28.3 kB]
Get: 15 http://deb.debian.org/debian forky/main arm64 sensible-utils all 0.0.26 [27.0 kB]
Get: 16 http://deb.debian.org/debian forky/main arm64 libmagic-mgc arm64 1:5.46-5 [338 kB]
Get: 17 http://deb.debian.org/debian forky/main arm64 libmagic1t64 arm64 1:5.46-5 [103 kB]
Get: 18 http://deb.debian.org/debian forky/main arm64 file arm64 1:5.46-5 [43.7 kB]
Get: 19 http://deb.debian.org/debian forky/main arm64 gettext-base arm64 0.23.1-2+b1 [241 kB]
Get: 20 http://deb.debian.org/debian forky/main arm64 libuchardet0 arm64 0.0.8-2 [69.0 kB]
Get: 21 http://deb.debian.org/debian forky/main arm64 groff-base arm64 1.23.0-9 [1130 kB]
Get: 22 http://deb.debian.org/debian forky/main arm64 bsdextrautils arm64 2.41.2-4 [97.3 kB]
Get: 23 http://deb.debian.org/debian forky/main arm64 libpipeline1 arm64 1.5.8-1 [40.2 kB]
Get: 24 http://deb.debian.org/debian forky/main arm64 man-db arm64 2.13.1-1 [1453 kB]
Get: 25 http://deb.debian.org/debian forky/main arm64 m4 arm64 1.4.20-2 [315 kB]
Get: 26 http://deb.debian.org/debian forky/main arm64 autoconf all 2.72-3.1 [494 kB]
Get: 27 http://deb.debian.org/debian forky/main arm64 autotools-dev all 20240727.1 [60.2 kB]
Get: 28 http://deb.debian.org/debian forky/main arm64 automake all 1:1.18.1-2 [877 kB]
Get: 29 http://deb.debian.org/debian forky/main arm64 autopoint all 0.23.1-2 [770 kB]
Get: 30 http://deb.debian.org/debian forky/main arm64 libdebhelper-perl all 13.28 [92.4 kB]
Get: 31 http://deb.debian.org/debian forky/main arm64 libtool all 2.5.4-7 [540 kB]
Get: 32 http://deb.debian.org/debian forky/main arm64 dh-autoreconf all 21 [12.2 kB]
Get: 33 http://deb.debian.org/debian forky/main arm64 libarchive-zip-perl all 1.68-1 [104 kB]
Get: 34 http://deb.debian.org/debian forky/main arm64 libfile-stripnondeterminism-perl all 1.15.0-1 [19.9 kB]
Get: 35 http://deb.debian.org/debian forky/main arm64 dh-strip-nondeterminism all 1.15.0-1 [8812 B]
Get: 36 http://deb.debian.org/debian forky/main arm64 libelf1t64 arm64 0.193-3 [189 kB]
Get: 37 http://deb.debian.org/debian forky/main arm64 dwz arm64 0.16-2 [100 kB]
Get: 38 http://deb.debian.org/debian forky/main arm64 libunistring5 arm64 1.3-2 [453 kB]
Get: 39 http://deb.debian.org/debian forky/main arm64 libxml2-16 arm64 2.14.6+dfsg-0.1 [601 kB]
Get: 40 http://deb.debian.org/debian forky/main arm64 gettext arm64 0.23.1-2+b1 [1612 kB]
Get: 41 http://deb.debian.org/debian forky/main arm64 intltool-debian all 0.35.0+20060710.6 [22.9 kB]
Get: 42 http://deb.debian.org/debian forky/main arm64 po-debconf all 1.0.21+nmu1 [248 kB]
Get: 43 http://deb.debian.org/debian forky/main arm64 debhelper all 13.28 [941 kB]
Get: 44 http://deb.debian.org/debian forky/main arm64 dh-python all 6.20250414 [116 kB]
Get: 45 http://deb.debian.org/debian forky/main arm64 python3-all arm64 3.13.7-1 [1044 B]
Get: 46 http://deb.debian.org/debian forky/main arm64 python3-autocommand all 2.2.2-3 [13.6 kB]
Get: 47 http://deb.debian.org/debian forky/main arm64 python3-more-itertools all 10.8.0-1 [71.7 kB]
Get: 48 http://deb.debian.org/debian forky/main arm64 python3-typing-extensions all 4.15.0-1 [92.4 kB]
Get: 49 http://deb.debian.org/debian forky/main arm64 python3-typeguard all 4.4.4-1 [37.1 kB]
Get: 50 http://deb.debian.org/debian forky/main arm64 python3-inflect all 7.5.0-1 [33.0 kB]
Get: 51 http://deb.debian.org/debian forky/main arm64 python3-iniconfig all 2.1.0-1 [7432 B]
Get: 52 http://deb.debian.org/debian forky/main arm64 python3-jaraco.functools all 4.1.0-1 [12.0 kB]
Get: 53 http://deb.debian.org/debian forky/main arm64 python3-pkg-resources all 78.1.1-0.1 [224 kB]
Get: 54 http://deb.debian.org/debian forky/main arm64 python3-jaraco.text all 4.0.0-1 [11.4 kB]
Get: 55 http://deb.debian.org/debian forky/main arm64 python3-zipp all 3.23.0-1 [11.0 kB]
Get: 56 http://deb.debian.org/debian forky/main arm64 python3-setuptools all 78.1.1-0.1 [738 kB]
Get: 57 http://deb.debian.org/debian forky/main arm64 python3-jaraco.context all 6.0.1-1 [8276 B]
Get: 58 http://deb.debian.org/debian forky/main arm64 python3-packaging all 25.0-1 [56.6 kB]
Get: 59 http://deb.debian.org/debian forky/main arm64 python3-pluggy all 1.6.0-1 [27.1 kB]
Get: 60 http://deb.debian.org/debian forky/main arm64 python3-pygments all 2.18.0+dfsg-2 [836 kB]
Get: 61 http://deb.debian.org/debian forky/main arm64 python3-pyparsing all 3.1.3-1 [148 kB]
Get: 62 http://deb.debian.org/debian forky/main arm64 python3-pytest all 8.4.2-1 [266 kB]
Get: 63 http://deb.debian.org/debian forky/main arm64 python3-rdflib all 7.1.1-3 [472 kB]
Fetched 20.5 MB in 0s (87.3 MB/s)
Preconfiguring packages ...
Selecting previously unselected package libexpat1:arm64.
(Reading database ... 19971 files and directories currently installed.)
Preparing to unpack .../libexpat1_2.7.3-1_arm64.deb ...
Unpacking libexpat1:arm64 (2.7.3-1) ...
Selecting previously unselected package libpython3.13-minimal:arm64.
Preparing to unpack .../libpython3.13-minimal_3.13.9-1_arm64.deb ...
Unpacking libpython3.13-minimal:arm64 (3.13.9-1) ...
Selecting previously unselected package python3.13-minimal.
Preparing to unpack .../python3.13-minimal_3.13.9-1_arm64.deb ...
Unpacking python3.13-minimal (3.13.9-1) ...
Setting up libpython3.13-minimal:arm64 (3.13.9-1) ...
Setting up libexpat1:arm64 (2.7.3-1) ...
Setting up python3.13-minimal (3.13.9-1) ...
Selecting previously unselected package python3-minimal.
(Reading database ... 20305 files and directories currently installed.)
Preparing to unpack .../0-python3-minimal_3.13.7-1_arm64.deb ...
Unpacking python3-minimal (3.13.7-1) ...
Selecting previously unselected package media-types.
Preparing to unpack .../1-media-types_14.0.0_all.deb ...
Unpacking media-types (14.0.0) ...
Selecting previously unselected package netbase.
Preparing to unpack .../2-netbase_6.5_all.deb ...
Unpacking netbase (6.5) ...
Selecting previously unselected package tzdata.
Preparing to unpack .../3-tzdata_2025b-5_all.deb ...
Unpacking tzdata (2025b-5) ...
Selecting previously unselected package libffi8:arm64.
Preparing to unpack .../4-libffi8_3.5.2-2_arm64.deb ...
Unpacking libffi8:arm64 (3.5.2-2) ...
Selecting previously unselected package readline-common.
Preparing to unpack .../5-readline-common_8.3-3_all.deb ...
Unpacking readline-common (8.3-3) ...
Selecting previously unselected package libreadline8t64:arm64.
Preparing to unpack .../6-libreadline8t64_8.3-3_arm64.deb ...
Adding 'diversion of /lib/aarch64-linux-gnu/libhistory.so.8 to /lib/aarch64-linux-gnu/libhistory.so.8.usr-is-merged by libreadline8t64'
Adding 'diversion of /lib/aarch64-linux-gnu/libhistory.so.8.2 to /lib/aarch64-linux-gnu/libhistory.so.8.2.usr-is-merged by libreadline8t64'
Adding 'diversion of /lib/aarch64-linux-gnu/libreadline.so.8 to /lib/aarch64-linux-gnu/libreadline.so.8.usr-is-merged by libreadline8t64'
Adding 'diversion of /lib/aarch64-linux-gnu/libreadline.so.8.2 to /lib/aarch64-linux-gnu/libreadline.so.8.2.usr-is-merged by libreadline8t64'
Unpacking libreadline8t64:arm64 (8.3-3) ...
Selecting previously unselected package libpython3.13-stdlib:arm64.
Preparing to unpack .../7-libpython3.13-stdlib_3.13.9-1_arm64.deb ...
Unpacking libpython3.13-stdlib:arm64 (3.13.9-1) ...
Selecting previously unselected package python3.13.
Preparing to unpack .../8-python3.13_3.13.9-1_arm64.deb ...
Unpacking python3.13 (3.13.9-1) ...
Selecting previously unselected package libpython3-stdlib:arm64.
Preparing to unpack .../9-libpython3-stdlib_3.13.7-1_arm64.deb ...
Unpacking libpython3-stdlib:arm64 (3.13.7-1) ...
Setting up python3-minimal (3.13.7-1) ...
Selecting previously unselected package python3.
(Reading database ... 21320 files and directories currently installed.)
Preparing to unpack .../00-python3_3.13.7-1_arm64.deb ...
Unpacking python3 (3.13.7-1) ...
Selecting previously unselected package sensible-utils.
Preparing to unpack .../01-sensible-utils_0.0.26_all.deb ...
Unpacking sensible-utils (0.0.26) ...
Selecting previously unselected package libmagic-mgc.
Preparing to unpack .../02-libmagic-mgc_1%3a5.46-5_arm64.deb ...
Unpacking libmagic-mgc (1:5.46-5) ...
Selecting previously unselected package libmagic1t64:arm64.
Preparing to unpack .../03-libmagic1t64_1%3a5.46-5_arm64.deb ...
Unpacking libmagic1t64:arm64 (1:5.46-5) ...
Selecting previously unselected package file.
Preparing to unpack .../04-file_1%3a5.46-5_arm64.deb ...
Unpacking file (1:5.46-5) ...
Selecting previously unselected package gettext-base.
Preparing to unpack .../05-gettext-base_0.23.1-2+b1_arm64.deb ...
Unpacking gettext-base (0.23.1-2+b1) ...
Selecting previously unselected package libuchardet0:arm64.
Preparing to unpack .../06-libuchardet0_0.0.8-2_arm64.deb ...
Unpacking libuchardet0:arm64 (0.0.8-2) ...
Selecting previously unselected package groff-base.
Preparing to unpack .../07-groff-base_1.23.0-9_arm64.deb ...
Unpacking groff-base (1.23.0-9) ...
Selecting previously unselected package bsdextrautils.
Preparing to unpack .../08-bsdextrautils_2.41.2-4_arm64.deb ...
Unpacking bsdextrautils (2.41.2-4) ...
Selecting previously unselected package libpipeline1:arm64.
Preparing to unpack .../09-libpipeline1_1.5.8-1_arm64.deb ...
Unpacking libpipeline1:arm64 (1.5.8-1) ...
Selecting previously unselected package man-db.
Preparing to unpack .../10-man-db_2.13.1-1_arm64.deb ...
Unpacking man-db (2.13.1-1) ...
Selecting previously unselected package m4.
Preparing to unpack .../11-m4_1.4.20-2_arm64.deb ...
Unpacking m4 (1.4.20-2) ...
Selecting previously unselected package autoconf.
Preparing to unpack .../12-autoconf_2.72-3.1_all.deb ...
Unpacking autoconf (2.72-3.1) ...
Selecting previously unselected package autotools-dev.
Preparing to unpack .../13-autotools-dev_20240727.1_all.deb ...
Unpacking autotools-dev (20240727.1) ...
Selecting previously unselected package automake.
Preparing to unpack .../14-automake_1%3a1.18.1-2_all.deb ...
Unpacking automake (1:1.18.1-2) ...
Selecting previously unselected package autopoint.
Preparing to unpack .../15-autopoint_0.23.1-2_all.deb ...
Unpacking autopoint (0.23.1-2) ...
Selecting previously unselected package libdebhelper-perl.
Preparing to unpack .../16-libdebhelper-perl_13.28_all.deb ...
Unpacking libdebhelper-perl (13.28) ...
Selecting previously unselected package libtool.
Preparing to unpack .../17-libtool_2.5.4-7_all.deb ...
Unpacking libtool (2.5.4-7) ...
Selecting previously unselected package dh-autoreconf.
Preparing to unpack .../18-dh-autoreconf_21_all.deb ...
Unpacking dh-autoreconf (21) ...
Selecting previously unselected package libarchive-zip-perl.
Preparing to unpack .../19-libarchive-zip-perl_1.68-1_all.deb ...
Unpacking libarchive-zip-perl (1.68-1) ...
Selecting previously unselected package libfile-stripnondeterminism-perl.
Preparing to unpack .../20-libfile-stripnondeterminism-perl_1.15.0-1_all.deb ...
Unpacking libfile-stripnondeterminism-perl (1.15.0-1) ...
Selecting previously unselected package dh-strip-nondeterminism.
Preparing to unpack .../21-dh-strip-nondeterminism_1.15.0-1_all.deb ...
Unpacking dh-strip-nondeterminism (1.15.0-1) ...
Selecting previously unselected package libelf1t64:arm64.
Preparing to unpack .../22-libelf1t64_0.193-3_arm64.deb ...
Unpacking libelf1t64:arm64 (0.193-3) ...
Selecting previously unselected package dwz.
Preparing to unpack .../23-dwz_0.16-2_arm64.deb ...
Unpacking dwz (0.16-2) ...
Selecting previously unselected package libunistring5:arm64.
Preparing to unpack .../24-libunistring5_1.3-2_arm64.deb ...
Unpacking libunistring5:arm64 (1.3-2) ...
Selecting previously unselected package libxml2-16:arm64.
Preparing to unpack .../25-libxml2-16_2.14.6+dfsg-0.1_arm64.deb ...
Unpacking libxml2-16:arm64 (2.14.6+dfsg-0.1) ...
Selecting previously unselected package gettext.
Preparing to unpack .../26-gettext_0.23.1-2+b1_arm64.deb ...
Unpacking gettext (0.23.1-2+b1) ...
Selecting previously unselected package intltool-debian.
Preparing to unpack .../27-intltool-debian_0.35.0+20060710.6_all.deb ...
Unpacking intltool-debian (0.35.0+20060710.6) ...
Selecting previously unselected package po-debconf.
Preparing to unpack .../28-po-debconf_1.0.21+nmu1_all.deb ...
Unpacking po-debconf (1.0.21+nmu1) ...
Selecting previously unselected package debhelper.
Preparing to unpack .../29-debhelper_13.28_all.deb ...
Unpacking debhelper (13.28) ...
Selecting previously unselected package dh-python.
Preparing to unpack .../30-dh-python_6.20250414_all.deb ...
Unpacking dh-python (6.20250414) ...
Selecting previously unselected package python3-all.
Preparing to unpack .../31-python3-all_3.13.7-1_arm64.deb ...
Unpacking python3-all (3.13.7-1) ...
Selecting previously unselected package python3-autocommand.
Preparing to unpack .../32-python3-autocommand_2.2.2-3_all.deb ...
Unpacking python3-autocommand (2.2.2-3) ...
Selecting previously unselected package python3-more-itertools.
Preparing to unpack .../33-python3-more-itertools_10.8.0-1_all.deb ...
Unpacking python3-more-itertools (10.8.0-1) ...
Selecting previously unselected package python3-typing-extensions.
Preparing to unpack .../34-python3-typing-extensions_4.15.0-1_all.deb ...
Unpacking python3-typing-extensions (4.15.0-1) ...
Selecting previously unselected package python3-typeguard.
Preparing to unpack .../35-python3-typeguard_4.4.4-1_all.deb ...
Unpacking python3-typeguard (4.4.4-1) ...
Selecting previously unselected package python3-inflect.
Preparing to unpack .../36-python3-inflect_7.5.0-1_all.deb ...
Unpacking python3-inflect (7.5.0-1) ...
Selecting previously unselected package python3-iniconfig.
Preparing to unpack .../37-python3-iniconfig_2.1.0-1_all.deb ...
Unpacking python3-iniconfig (2.1.0-1) ...
Selecting previously unselected package python3-jaraco.functools.
Preparing to unpack .../38-python3-jaraco.functools_4.1.0-1_all.deb ...
Unpacking python3-jaraco.functools (4.1.0-1) ...
Selecting previously unselected package python3-pkg-resources.
Preparing to unpack .../39-python3-pkg-resources_78.1.1-0.1_all.deb ...
Unpacking python3-pkg-resources (78.1.1-0.1) ...
Selecting previously unselected package python3-jaraco.text.
Preparing to unpack .../40-python3-jaraco.text_4.0.0-1_all.deb ...
Unpacking python3-jaraco.text (4.0.0-1) ...
Selecting previously unselected package python3-zipp.
Preparing to unpack .../41-python3-zipp_3.23.0-1_all.deb ...
Unpacking python3-zipp (3.23.0-1) ...
Selecting previously unselected package python3-setuptools.
Preparing to unpack .../42-python3-setuptools_78.1.1-0.1_all.deb ...
Unpacking python3-setuptools (78.1.1-0.1) ...
Selecting previously unselected package python3-jaraco.context.
Preparing to unpack .../43-python3-jaraco.context_6.0.1-1_all.deb ...
Unpacking python3-jaraco.context (6.0.1-1) ...
Selecting previously unselected package python3-packaging. Preparing to unpack .../44-python3-packaging_25.0-1_all.deb ... Unpacking python3-packaging (25.0-1) ... Selecting previously unselected package python3-pluggy. Preparing to unpack .../45-python3-pluggy_1.6.0-1_all.deb ... Unpacking python3-pluggy (1.6.0-1) ... Selecting previously unselected package python3-pygments. Preparing to unpack .../46-python3-pygments_2.18.0+dfsg-2_all.deb ... Unpacking python3-pygments (2.18.0+dfsg-2) ... Selecting previously unselected package python3-pyparsing. Preparing to unpack .../47-python3-pyparsing_3.1.3-1_all.deb ... Unpacking python3-pyparsing (3.1.3-1) ... Selecting previously unselected package python3-pytest. Preparing to unpack .../48-python3-pytest_8.4.2-1_all.deb ... Unpacking python3-pytest (8.4.2-1) ... Selecting previously unselected package python3-rdflib. Preparing to unpack .../49-python3-rdflib_7.1.1-3_all.deb ... Unpacking python3-rdflib (7.1.1-3) ... Setting up media-types (14.0.0) ... Setting up libpipeline1:arm64 (1.5.8-1) ... Setting up bsdextrautils (2.41.2-4) ... Setting up libmagic-mgc (1:5.46-5) ... Setting up libarchive-zip-perl (1.68-1) ... Setting up libxml2-16:arm64 (2.14.6+dfsg-0.1) ... Setting up libdebhelper-perl (13.28) ... Setting up libmagic1t64:arm64 (1:5.46-5) ... Setting up gettext-base (0.23.1-2+b1) ... Setting up m4 (1.4.20-2) ... Setting up file (1:5.46-5) ... Setting up libelf1t64:arm64 (0.193-3) ... Setting up tzdata (2025b-5) ... Current default time zone: 'Etc/UTC' Local time is now: Fri Oct 31 21:48:43 UTC 2025. Universal Time is now: Fri Oct 31 21:48:43 UTC 2025. Run 'dpkg-reconfigure tzdata' if you wish to change it. Setting up autotools-dev (20240727.1) ... Setting up libunistring5:arm64 (1.3-2) ... Setting up autopoint (0.23.1-2) ... Setting up autoconf (2.72-3.1) ... Setting up libffi8:arm64 (3.5.2-2) ... Setting up dwz (0.16-2) ... Setting up sensible-utils (0.0.26) ... Setting up libuchardet0:arm64 (0.0.8-2) ... 
Setting up netbase (6.5) ... Setting up readline-common (8.3-3) ... Setting up automake (1:1.18.1-2) ... update-alternatives: using /usr/bin/automake-1.18 to provide /usr/bin/automake (automake) in auto mode Setting up libfile-stripnondeterminism-perl (1.15.0-1) ... Setting up gettext (0.23.1-2+b1) ... Setting up libtool (2.5.4-7) ... Setting up intltool-debian (0.35.0+20060710.6) ... Setting up dh-autoreconf (21) ... Setting up libreadline8t64:arm64 (8.3-3) ... Setting up dh-strip-nondeterminism (1.15.0-1) ... Setting up groff-base (1.23.0-9) ... Setting up libpython3.13-stdlib:arm64 (3.13.9-1) ... Setting up libpython3-stdlib:arm64 (3.13.7-1) ... Setting up python3.13 (3.13.9-1) ... Setting up po-debconf (1.0.21+nmu1) ... Setting up python3 (3.13.7-1) ... Setting up python3-zipp (3.23.0-1) ... Setting up python3-autocommand (2.2.2-3) ... Setting up man-db (2.13.1-1) ... Not building database; man-db/auto-update is not 'true'. Setting up python3-pygments (2.18.0+dfsg-2) ... Setting up python3-packaging (25.0-1) ... Setting up python3-pyparsing (3.1.3-1) ... Setting up python3-typing-extensions (4.15.0-1) ... Setting up python3-pluggy (1.6.0-1) ... Setting up python3-rdflib (7.1.1-3) ... Setting up dh-python (6.20250414) ... Setting up python3-more-itertools (10.8.0-1) ... Setting up python3-iniconfig (2.1.0-1) ... Setting up python3-jaraco.functools (4.1.0-1) ... Setting up python3-jaraco.context (6.0.1-1) ... Setting up python3-pytest (8.4.2-1) ... Setting up python3-typeguard (4.4.4-1) ... Setting up python3-all (3.13.7-1) ... Setting up debhelper (13.28) ... Setting up python3-inflect (7.5.0-1) ... Setting up python3-jaraco.text (4.0.0-1) ... Setting up python3-pkg-resources (78.1.1-0.1) ... Setting up python3-setuptools (78.1.1-0.1) ... Processing triggers for libc-bin (2.41-12) ... Reading package lists... Building dependency tree... Reading state information... Reading extended state information... Initializing package states... 
Writing extended state information...
Building tag database...
 -> Finished parsing the build-deps
I: Building the package
I: Running cd /build/reproducible-path/sparql-wrapper-python-2.0.0/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-genchanges -S > ../sparql-wrapper-python_2.0.0-2_source.changes
dpkg-buildpackage: info: source package sparql-wrapper-python
dpkg-buildpackage: info: source version 2.0.0-2
dpkg-buildpackage: info: source distribution unstable
dpkg-buildpackage: info: source changed by Alexandre Detiste
 dpkg-source --before-build .
dpkg-buildpackage: info: host architecture arm64
 debian/rules clean
dh clean --buildsystem=pybuild
   dh_auto_clean -O--buildsystem=pybuild
I: pybuild base:311: python3.13 setup.py clean
/usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated.
!!

        ********************************************************************************
        Please consider removing the following classifiers in favor of a SPDX license expression:

        License :: OSI Approved :: W3C License

        See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
        ********************************************************************************

!!
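(Context, not part of the log: the command above pins PATH and uses a deliberately nonexistent HOME for the first build; the second build, on the other node named at the top of the log, repeats it under varied conditions, and reproducibility means the resulting artifacts are bit-for-bit identical. A minimal sketch of that final comparison follows; the artifact paths are hypothetical, and the real framework uses diffoscope to explain any difference rather than just detect it.)

```python
import hashlib

def sha256(path: str) -> str:
    """Stream a file and return its hex SHA-256 digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

# Hypothetical paths for the .deb produced by each of the two builds;
# equal digests would mean the package is bit-for-bit reproducible.
# reproducible = (sha256("b1/python3-sparqlwrapper_2.0.0-2_all.deb")
#                 == sha256("b2/python3-sparqlwrapper_2.0.0-2_all.deb"))
```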
  self._finalize_license_expression()
running clean
removing '/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build' (and everything under it)
'build/bdist.linux-aarch64' does not exist -- can't clean it
'build/scripts-3.13' does not exist -- can't clean it
   dh_autoreconf_clean -O--buildsystem=pybuild
   dh_clean -O--buildsystem=pybuild
 debian/rules binary
dh binary --buildsystem=pybuild
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
I: pybuild base:311: python3.13 setup.py config
/usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated.
!!

        ********************************************************************************
        Please consider removing the following classifiers in favor of a SPDX license expression:

        License :: OSI Approved :: W3C License

        See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
        ********************************************************************************

!!
  self._finalize_license_expression()
running config
   dh_auto_build -O--buildsystem=pybuild
I: pybuild base:311: /usr/bin/python3 setup.py build
/usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated.
!!

        ********************************************************************************
        Please consider removing the following classifiers in favor of a SPDX license expression:

        License :: OSI Approved :: W3C License

        See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
        ********************************************************************************

!!
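(Context, not part of the log: the SetuptoolsDeprecationWarning repeated at each setup.py step asks upstream to replace the `License :: OSI Approved :: W3C License` classifier with an SPDX license expression. A hedged sketch of what that change could look like in the project's pyproject.toml; whether `W3C` is the SPDX identifier upstream intends is an assumption here.)

```toml
[project]
# Instead of the deprecated classifier
#   "License :: OSI Approved :: W3C License"
# declare the license as an SPDX expression:
license = "W3C"
```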
  self._finalize_license_expression()
running build
running build_py
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
I: pybuild pybuild:334: cp -r test /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build
 debian/rules override_dh_auto_test
make[1]: Entering directory '/build/reproducible-path/sparql-wrapper-python-2.0.0'
# tests need a remote server
dh_auto_test || :
I: pybuild base:311: cd /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build; python3.13 -m pytest test
============================= test session starts ==============================
platform linux -- Python 3.13.9, pytest-8.4.2, pluggy-1.6.0
rootdir: /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build
configfile: pyproject.toml
plugins: typeguard-4.4.4
collected 1525 items

test/test_agrovoc-allegrograph_on_hold.py sFxxsFFsFFxsFFxxsFFFFxxsFFFFxx [  1%]
sFFFFxxsFFFFFFFFssFFFxxFFxFFxxFFF [  4%]
test/test_allegrograph__v4_14_1__mmi.py ssFFFFFFssFFFFssFFFFFFssFFFFFFss [  6%]
FFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFFFssFFFFFF [ 10%]
FFFFFFFFFFFFFFFFFFFFFFF [ 12%]
test/test_blazegraph__wikidata.py ssFFFFFFssFFFFssFFFFFFssFFFFFFsFsFsFFF [ 14%]
sFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFFsFsFFFFFFFsFF [ 19%]
FFFsFFFFFFFsFFFFF [ 20%]
test/test_cli.py ..F...FFFFFFFFFFFFFFFFFFFFFF [ 22%]
test/test_fuseki2__v3_6_0__agrovoc.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFF [ 24%]
sFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFs [ 29%]
FFFFFFFFFFsFFsFFFFFFF [ 30%]
test/test_fuseki2__v3_8_0__stw.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFFsFsF [ 33%]
sFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFsFFFF [ 38%]
FFFFFFsFFsFFFFFFF [ 39%]
test/test_graphdbEnterprise__v8_9_0__rs.py ssssFFsFsssFsFssssFFsFsssFsFs [ 41%]
FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFsFFsFsF [ 45%]
ssFFsFsFsFsFsFssFFsFsFsFsF [ 47%]
test/test_lov-fuseki_on_hold.py FFFFFFFFFFFFFFssssssssssssssFFFFFFFFFFFF [ 50%]
FFFFssssssssssssssssFFFFFFFFFFFFFFFFssssssssssssssssFsFFssFFFFFFFFFFFFFF [ 54%]
Fssssssssssssss [ 55%]
test/test_rdf4j__geosciml.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 58%]
FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFssFFsFsFssFFsFsFsFsFs [ 63%]
FssFFsFsFsFsF [ 64%]
test/test_stardog__lindas.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 67%]
FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFssFFFsFsFssFFsFsFsFsFs [ 71%]
FssFFsFsFsFsF [ 72%]
test/test_store__v1_1_4.py FFFsFFsFsFxFxFxxxxxxxxxxxxxxsFsssFsFsFsFxFxFx [ 75%]
xssxxxxxxxxxxxxsFsssFsssFssxFxFxxssxxxxxxxxxxxxFFFFssFFFFsFFsFsFxFxFxxxx [ 80%]
xxxxxxxxxx [ 81%]
test/test_virtuoso__v7_20_3230__dbpedia.py FFFssFssFFFFFFsssssFsssssssss [ 82%]
FFFssFFFFFFFFFFsFssssFssssssFsssFFFssFFFFFFFFFFssssssssssssssssFFFFssFFF [ 87%]
FFFFssFFFFFFsssFFsssssssss [ 89%]
test/test_virtuoso__v8_03_3313__dbpedia.py FFFssFssFFFFFFsssssssssssssss [ 91%]
FFFssFFFFFFFFFFsssssssssssssssssFFFssFFFFFFFFFFssssssssssssssssFFFFFsFFF [ 96%]
FFFFssFFFFFFssssssssssssss [ 97%]
test/test_wrapper.py ....s..........................F... [100%]

=================================== FAILURES ===================================
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:403:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

(same urllib do_open/create_connection listings as in testAskByGETinJSON above, elided; here with Accept: 'application/sparql-results+xml')
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:459:
(same __generic/query/urlopen chain and second do_open listing as above, elided)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

(same urllib do_open/create_connection listings as above, elided; Accept: 'application/sparql-results+xml')
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:345:
(same __generic/query/urlopen chain and second do_open listing as above, elided)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

(same urllib do_open/create_connection listings as above, elided)
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON(self): > result = self.__generic(askQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:410: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByPOSTinJSONLD ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSONLD(self): > result = self.__generic(askQuery, JSONLD, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:451: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow(self): > result = self.__generic(askQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:469: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML(self): > result = self.__generic(askQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                    del headers[proxy_auth_hdr]
                h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
            try:
                try:
>                   h.request(req.get_method(), req.selector, req.data, headers,
                              encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of
        (host, port) for the socket to bind as a source address before
        making the connection.  A host of '' or port 0 tells the OS to
        use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_agrovoc-allegrograph_on_hold.py:513:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                    del headers[proxy_auth_hdr]
                h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
            try:
                try:
                    h.request(req.get_method(), req.selector, req.data, headers,
                              encode_chunked=req.has_header('Transfer-encoding'))
                except OSError as err:  # timeout error
>                   raise URLError(err)
E                   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_agrovoc-allegrograph_on_hold.py:499:
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
>                   raise URLError(err)
E                   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_agrovoc-allegrograph_on_hold.py:593:
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
>                   raise URLError(err)
E                   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'agrovoc.fao.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:485:
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
>                   raise URLError(err)
E                   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '490',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)

test/test_agrovoc-allegrograph_on_hold.py:520:
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, POST)

test/test_agrovoc-allegrograph_on_hold.py:506:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open source listing repeated verbatim by pytest; identical to the listing above]
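Every failure in this run reduces to the same condition: the test endpoint resolves to 127.0.0.1:9 (the discard port, on which nothing listens in the build environment), so `sock.connect()` is refused and urllib's `do_open` re-raises the `OSError` as a `URLError`. A minimal sketch of that wrapping, assuming nothing listens on local port 9; the `/sparql` path is illustrative:

```python
import urllib.error
import urllib.request

# 127.0.0.1:9 is the discard port; with no listener the TCP connect is refused.
try:
    urllib.request.urlopen("http://127.0.0.1:9/sparql", timeout=5)
except urllib.error.URLError as err:
    # do_open() catches the OSError and re-raises it as URLError;
    # the original ConnectionRefusedError is preserved in err.reason.
    assert isinstance(err.reason, ConnectionRefusedError)
```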
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open/create_connection listings identical to testConstructByPOSTinRDFXML above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_agrovoc-allegrograph_on_hold.py:601:
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

[second do_open listing elided]
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open/create_connection listings identical to testConstructByPOSTinRDFXML above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_agrovoc-allegrograph_on_hold.py:492:
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

[second do_open listing elided]
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection listings identical to testConstructByPOSTinRDFXML above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_agrovoc-allegrograph_on_hold.py:643:
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

[second do_open listing elided]
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection listings identical to testConstructByPOSTinRDFXML above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_agrovoc-allegrograph_on_hold.py:629:
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

[second do_open listing elided]
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection listings identical to testConstructByPOSTinRDFXML above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_agrovoc-allegrograph_on_hold.py:724:
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:615: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinN3(self): > result = self.__generic(describeQuery, N3, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:650: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinRDFXML(self): > result = self.__generic(describeQuery, RDFXML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:636: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
    When a connection cannot be created, raises the last error if *all_errors*
    is False, and an ExceptionGroup of all errors if *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:732:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:622:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_agrovoc-allegrograph_on_hold.py:757:
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_agrovoc-allegrograph_on_hold.py:742: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryDuplicatedPrefix(self): > result = self.__generic(queryDuplicatedPrefix, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:748: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:745:
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:769:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:232:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:260:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:246:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:239: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:267: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:253: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:329: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:224: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356',
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON(self): > result = self.__generic(askQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:572: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected(self): > result = self.__generic(askQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:647: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 
2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:658: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:579: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByGETinN3_Unexpected>

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)

test/test_allegrograph__v4_14_1__mmi.py:603:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_allegrograph__v4_14_1__mmi.py:689:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:698:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:460:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:468:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '481',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByPOSTinJSON>

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:586:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '481',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '515',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected>

    def testAskByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:669:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '515',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected_Conneg>

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:680:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '444',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_allegrograph__v4_14_1__mmi.SPARQLWrapperTests testMethod=testAskByPOSTinJSON_Conneg>

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:593:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '444',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '475', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinN3_Unexpected(self): > result = self.__generic(askQuery, N3, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:625: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '475', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:636: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': 
'444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '478', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow(self): > result = self.__generic(askQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:707: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '478', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow_Conneg(self): > result = self.__generic(askQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:716: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 
'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '478', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:476: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '478', 'Content-Type':
'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:484: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)

test/test_allegrograph__v4_14_1__mmi.py:885: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:894: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:921: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:930: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3(self): > result = self.__generic(constructQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:823: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:830: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML(self): > result = self.__generic(constructQuery, RDFXML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:760: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:768:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml',
'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:956: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:964: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML(self): > result = self.__generic(constructQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:731: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:738: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:903:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '662', 'Content-Type':
'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:912:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close',
'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '665', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:939:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '665',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:948:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection':
'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '659', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:837:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '659', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:844:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '768',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
>   h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in connect
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, POST)

test/test_allegrograph__v4_14_1__mmi.py:776:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '768',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
>   h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in connect
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:784:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '662',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
>   h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in connect
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_allegrograph__v4_14_1__mmi.py:972:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '662',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
>   h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/socket.py:849: in connect
>   sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:980:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML(self): > result = self.__generic(constructQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:745: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML_Conneg(self): > result = self.__generic(constructQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:752: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected(self): > result = self.__generic(describeQuery, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1148: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1158: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1185:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1194:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close',
 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1086:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1093:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1023:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1031:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection':
'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow(self): > result = self.__generic(describeQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1220: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:994: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1001: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9)
timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:1167:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1176:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '468', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:1203:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1212:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '462', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinN3(self):
>       result = self.__generic(describeQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:1100:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '462', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1107:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '571', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1039:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '571', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1047:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1236:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1244: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 
'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML(self): > result = self.__generic(describeQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1008: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML_Conneg(self): > result = self.__generic(describeQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1015: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_allegrograph__v4_14_1__mmi.py:1269: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testQueryBadFormed _____________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_allegrograph__v4_14_1__mmi.py:1254: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
    and return the socket object.  Passing the optional *timeout* parameter
    will set the timeout on the socket instance before attempting to connect.
    If no *timeout* is supplied, the global default timeout setting returned
    by :func:`getdefaulttimeout` is used.  If *source_address* is set it must
    be a tuple of (host, port) for the socket to bind as a source address
    before making the connection.  A host of '' or port 0 tells the OS to use
    the default.
    When a connection cannot be created, raises the last error if *all_errors*
    is False, and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1260:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open listing repeated verbatim; see testQueryDuplicatedPrefix above ...]

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open listing repeated verbatim; see testQueryDuplicatedPrefix above ...]
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
    [... http.client / socket.create_connection frames repeated verbatim; see testQueryDuplicatedPrefix above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1257:
    [... SPARQLWrapper / urllib frames repeated verbatim; see testQueryDuplicatedPrefix above ...]

    [... do_open listing repeated verbatim; see testQueryDuplicatedPrefix above ...]
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open listing repeated verbatim; see testQueryDuplicatedPrefix above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1281:
    [... SPARQLWrapper / urllib frames repeated verbatim; see testQueryDuplicatedPrefix above ...]

    [... do_open listing repeated verbatim; see testQueryDuplicatedPrefix above ...]
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open listing repeated verbatim; see testQueryDuplicatedPrefix above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:245:
    [... SPARQLWrapper / urllib frames repeated verbatim; see testQueryDuplicatedPrefix above ...]

    [... do_open listing repeated verbatim; see testQueryDuplicatedPrefix above ...]
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open listing repeated verbatim; see testQueryDuplicatedPrefix above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:252:
    [... SPARQLWrapper / urllib frames repeated verbatim; see testQueryDuplicatedPrefix above ...]

    [... do_open listing repeated verbatim; see testQueryDuplicatedPrefix above ...]
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON(self): > result = self.__generic(selectQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:301: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected(self): > result = self.__generic(selectQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:387: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:308: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors
        if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:332:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors
        if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:343:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors
        if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:273:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors
        if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:280:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors
        if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:418:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:427:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0
 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0
 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:213:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0
 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:221:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '664',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:259:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '630',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:266:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '487',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:315:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '487', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected(self): > result = self.__generic(selectQuery, JSONLD, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:398: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:409: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:322: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected(self): > result = self.__generic(selectQuery, N3, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:354: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]

            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open()/create_connection() traceback identical to the failure above ...]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:365:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request do_open() locals and source identical to the failure above ...]
                del headers[proxy_auth_hdr]

            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Content-Length': '764', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open()/create_connection() traceback identical to the failures above ...]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST)

test/test_allegrograph__v4_14_1__mmi.py:287:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request do_open() locals and source identical to the failures above ...]
                del headers[proxy_auth_hdr]

            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open()/create_connection() traceback identical to the failures above ...]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:294:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request do_open() locals and source identical to the failures above ...]
                del headers[proxy_auth_hdr]

            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open()/create_connection() traceback identical to the failures above ...]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)

test/test_allegrograph__v4_14_1__mmi.py:436:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request do_open() locals and source identical to the failures above ...]
                del headers[proxy_auth_hdr]

            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open()/create_connection() traceback identical to the failures above ...]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:445:
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib.request do_open() locals and source identical to the failures above ...]
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:229: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '484',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError

_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '450',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:237: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '450',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError

____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:580: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError

______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:655: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError

__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:666: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:587:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:611:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:622:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:697:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:706:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML(self): > result = self.__generic(askQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:484: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:492: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '423', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON(self): > result = self.__generic(askQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:594: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 
'close', 'Content-Length': '423', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '457', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:677:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '457',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:688:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '386',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:601:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '386',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '417',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:633:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '417',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:644:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '420',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:715:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '420',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '386',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow_Conneg(self): > result = self.__generic(askQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:724: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML(self): > result = self.__generic(askQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:500: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:508: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected(self): > result = self.__generic(constructQuery, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:915: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next
        # request.  So make sure the connection gets closed after
        # the (only) request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:924:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:887:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:962:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:849:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_blazegraph__wikidata.py:768:
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:776: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:811: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 
'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:990: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:998: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:739: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:746: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:933: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:942: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:906: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:982: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinN3_Conneg(self): > result = self.__generic(constructQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:868: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '755', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML(self): > result = self.__generic(constructQuery, RDFXML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:784: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '755', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:792: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:830: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 
'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1006:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '649',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1014:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection':
 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:753:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '649', 'Content-Type':
 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:760:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1200:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1210: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 

address = ('127.0.0.1', 9), timeout = 
source_address = None
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1174: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld',
'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 

address = ('127.0.0.1', 9), timeout = 
source_address = None
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1248: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 

address = ('127.0.0.1', 9), timeout = 
source_address = None
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1138: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 

address = ('127.0.0.1', 9), timeout = 
source_address = None
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinRDFXML(self): > result = self.__generic(describeQuery, RDFXML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1057: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1065:
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1100:
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1276:
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1284:
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1028:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1035: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 
'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected(self): > result = self.__generic(describeQuery, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1219: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1191: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 
'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host,
    port) for the socket to bind as a source address before making the
    connection.  A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1268:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close',
           'Content-Length': '350',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '350',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
           'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '350',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '490',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, POST)

test/test_blazegraph__wikidata.py:1073:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '490', 'Content-Type':
           'application/x-www-form-urlencoded', ...}

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '350',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1081:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '350',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Content-Length': '350',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle',
'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow(self): > result = self.__generic(describeQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1292: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML(self): > result = self.__generic(describeQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1042: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML_Conneg(self): > result = self.__generic(describeQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1049: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_blazegraph__wikidata.py:1328: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_blazegraph__wikidata.py:1310: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1313: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1332: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1341: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:267: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host':
'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_blazegraph__wikidata.py:325: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_blazegraph__wikidata.py:400: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:411: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:332: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected(self): > result = self.__generic(selectQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:356: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:367: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:301: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 
'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow(self): > result = self.__generic(selectQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:442: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:451: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:225: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:233: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '406',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:284: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close',
 'Content-Length': '406',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '442',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:339: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '442',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '476',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected(self): > result = self.__generic(selectQuery, JSONLD, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:422: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '476', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:433: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected(self): > result = self.__generic(selectQuery, N3, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:378: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:389: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': 
'405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:318: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 
'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow(self): > result = self.__generic(selectQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:460: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow_Conneg(self): > result = self.__generic(selectQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:469: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 
'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinXML(self): > result = self.__generic(selectQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:241: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:249:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperCLIParser_Test.testInvalidFormat _________________

self =

    def testInvalidFormat(self):
        with self.assertRaises(SystemExit) as cm:
            parse_args(["-Q", testquery, "-F", "jjssoonn"])
        self.assertEqual(cm.exception.code, 2)
>       self.assertEqual(
            sys.stderr.getvalue().split("\n")[1],
            "rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld')",
        )
E       AssertionError: "rqw:[65 chars]from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld)" != "rqw:[65 chars]from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rd[28 chars]ld')"
E       - rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld)
E       + rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld')
E       ? + + + + + + + + + + + + + + + + + +

test/test_cli.py:79: AssertionError
______________________ SPARQLWrapperCLI_Test.testQueryRDF ______________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryRDF(self):
>       main(["-Q", "DESCRIBE ", "-e", endpoint, "-F", "rdf"])

test/test_cli.py:249:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req,
object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperCLI_Test.testQueryTo4store ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryTo4store(self): > main(["-e", "http://rdf.chise.org/sparql", "-Q", testquery]) test/test_cli.py:627: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperCLI_Test.testQueryToAgrovoc_AllegroGraph _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryToAgrovoc_AllegroGraph(self): > main(["-e", "https://agrovoc.fao.org/sparql", "-Q", testquery]) test/test_cli.py:459: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel 
Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperCLI_Test.testQueryToAllegroGraph _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryToAllegroGraph(self): > main(["-e", "https://mmisw.org/sparql", "-Q", testquery]) test/test_cli.py:378: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 
10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperCLI_Test.testQueryToBrazeGraph __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToBrazeGraph(self):
>       main(["-e", "https://query.wikidata.org/sparql", "-Q", testquery])

test/test_cli.py:546: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X
10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_6 _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToFuseki2V3_6(self):
>       main(["-e", "https://agrovoc.uniroma2.it/sparql/", "-Q", testquery])

test/test_cli.py:573: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS
X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_8 _________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToFuseki2V3_8(self):
>       main(["-e", "http://zbw.eu/beta/sparql/stw/query", "-Q", testquery])

test/test_cli.py:600: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperCLI_Test.testQueryToGraphDBEnterprise ______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToGraphDBEnterprise(self):
>       main(["-e", "http://factforge.net/repositories/ff-news", "-Q", testquery])

test/test_cli.py:405: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperCLI_Test.testQueryToLovFuseki __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToLovFuseki(self):
>       main(["-e", "https://lov.linkeddata.es/dataset/lov/sparql/", "-Q", testquery])

test/test_cli.py:317:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperCLI_Test.testQueryToRDF4J ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToRDF4J(self):
>       main(
            [
                "-e",
                "http://vocabs.ands.org.au/repository/api/sparql/csiro_international-chronostratigraphic-chart_2018-revised-corrected",
                "-Q",
                testquery,
            ]
        )

test/test_cli.py:344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperCLI_Test.testQueryToStardog ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '102', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToStardog(self):
>       main(["-e", "https://lindas.admin.ch/query", "-Q", testquery, "-m", POST])

test/test_cli.py:432:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '102', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV7 __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToVirtuosoV7(self):
>       main(["-e", "http://dbpedia.org/sparql", "-Q", testquery])

test/test_cli.py:516:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV8 __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToVirtuosoV8(self):
>       main(["-e", "http://dbpedia-live.openlinksw.com/sparql", "-Q", testquery])

test/test_cli.py:486:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryWithEndpoint __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithEndpoint(self):
>       main(
            [
                "-Q",
                testquery,
                "-e",
                endpoint,
            ]
        )

test/test_cli.py:97:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperCLI_Test.testQueryWithFile ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFile(self): > main(["-f", testfile, "-e", endpoint]) test/test_cli.py:135: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperCLI_Test.testQueryWithFileCSV __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFileCSV(self): > main(["-f", testfile, "-e", endpoint, "-F", "csv"]) test/test_cli.py:291: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperCLI_Test.testQueryWithFileN3 ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFileN3(self): > main(["-f", testfile, "-e", endpoint, "-F", "n3"]) test/test_cli.py:232: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse 
object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperCLI_Test.testQueryWithFileRDFXML _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFileRDFXML(self): > main(["-f", testfile, "-e", endpoint, "-F", "rdf+xml"]) test/test_cli.py:270: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperCLI_Test.testQueryWithFileTSV __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFileTSV(self): > main(["-f", testfile, "-e", endpoint, "-F", "tsv"]) test/test_cli.py:304: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperCLI_Test.testQueryWithFileTurtle _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFileTurtle(self): > main(["-f", testfile, "-e", endpoint, "-F", "turtle"]) test/test_cli.py:188: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperCLI_Test.testQueryWithFileTurtleQuiet ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFileTurtleQuiet(self): > main( [ "-f", testfile, "-e", endpoint, "-F", "turtle", "-q", ] ) test/test_cli.py:205: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperCLI_Test.testQueryWithFileXML __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFileXML(self): > main(["-f", testfile, "-e", endpoint, "-F", "xml"]) test/test_cli.py:167: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:489: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
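Every failure in this run bottoms out in the same place: `socket.create_connection()` getting `[Errno 111] Connection refused` from `127.0.0.1:9`, a local port with nothing listening on it. A minimal sketch of that failure mode in isolation (the bind-and-release trick for obtaining an unused port is an assumption for illustration, not part of the test suite):

```python
import socket

def probe(host, port, timeout=1.0):
    """Try a TCP connection; return None on success, or the OSError raised."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return None
    except OSError as exc:
        return exc

# Obtain a port that is very likely to have no listener: bind an
# ephemeral port, note its number, then close the socket again.
tmp = socket.socket()
tmp.bind(("127.0.0.1", 0))
unused_port = tmp.getsockname()[1]
tmp.close()

err = probe("127.0.0.1", unused_port)
print(type(err).__name__)  # ConnectionRefusedError on a typical Linux host
```

This mirrors the loop in `create_connection()` shown above: each candidate address is tried in turn, and when every `sock.connect(sa)` fails, the collected error is re-raised via `raise exceptions[0]`.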
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:496: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:545: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
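The second traceback in each failure shows why the tests surface `urllib.error.URLError` rather than the underlying `ConnectionRefusedError`: `do_open()` catches the low-level `OSError` and re-raises it as `URLError(err)`, keeping the original exception on the `reason` attribute. A sketch of that wrapping as seen from calling code (the unused-port setup is illustrative, standing in for the unreachable `127.0.0.1:9` endpoint):

```python
import socket
import urllib.error
import urllib.request

# Obtain a local port with no listener (illustrative setup).
tmp = socket.socket()
tmp.bind(("127.0.0.1", 0))
port = tmp.getsockname()[1]
tmp.close()

try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=2)
except urllib.error.URLError as exc:
    # do_open() executed `raise URLError(err)`; the original socket
    # error survives on exc.reason.
    print(type(exc.reason).__name__)  # ConnectionRefusedError
```

Code that needs to distinguish "endpoint down" from other fetch failures can therefore inspect `exc.reason` instead of parsing the message string.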
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:629: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:552: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
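The `headers` dict shown in the locals of every failure (`'Accept'`, `'Connection'`, `'Host'`, `'User-Agent'`) is already title-cased because, as the `do_open()` listing above shows, urllib forces `Connection: close` and then normalises all header names with `str.title()`. Those two lines in isolation, using sample header values taken from this log:

```python
# Mirrors the normalisation in urllib.request's do_open(): force the
# connection to close after a single request, then title-case every
# header name (values are left untouched).
headers = {"accept": "text/csv",
           "host": "agrovoc.uniroma2.it",
           "user-agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)"}
headers["Connection"] = "close"
headers = {name.title(): val for name, val in headers.items()}
print(sorted(headers))  # ['Accept', 'Connection', 'Host', 'User-Agent']
```

This is why the later `Proxy-Authorization` lookup in `do_open()` can rely on Title-Case: after normalisation, any casing the caller used for a header name has been rewritten.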
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:587: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV(self): > result = self.__generic(askQuery, TSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:517: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV_Conneg(self): > result = self.__generic(askQuery, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:524: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 
'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinUnknow(self): > result = self.__generic(askQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:658: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:667: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML(self): > result = self.__generic(askQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:457: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:465: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinCSV(self): > result = self.__generic(askQuery, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:503: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinCSV_Conneg(self): > result = self.__generic(askQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:510: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '302', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '339', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON(self): > result = self.__generic(askQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:559: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Content-Length': '339', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:650:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Content-Length': '302',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:566:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:608:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Content-Length': '436',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
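The `do_open` listing repeated through these tracebacks moves `Proxy-Authorization` out of the request headers into the CONNECT tunnel headers, so the credential reaches only the proxy and never the origin server. A sketch of that split, with hypothetical host names and a made-up credential (none of these values come from the log):

```python
import http.client

# Request headers as do_open might see them; the credential is invented.
headers = {
    "Proxy-Authorization": "Basic dXNlcjpwYXNz",
    "Connection": "close",
    "Content-Type": "application/x-www-form-urlencoded",
}

tunnel_headers = {}
proxy_auth_hdr = "Proxy-Authorization"
if proxy_auth_hdr in headers:
    # move, don't copy: the origin server must not see the proxy credential
    tunnel_headers[proxy_auth_hdr] = headers.pop(proxy_auth_hdr)

# set_tunnel() records the origin endpoint and tunnel headers; the CONNECT
# is only sent to the proxy when the connection is actually opened
conn = http.client.HTTPSConnection("proxy.example", 3128, timeout=5)
conn.set_tunnel("origin.example", 443, headers=tunnel_headers)
```

The `headers.pop(...)` corresponds to the `del headers[proxy_auth_hdr]` line that each traceback excerpt resumes with.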
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinTSV(self):
>       result = self.__generic(askQuery, TSV, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:531:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Content-Length': '302',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:538:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Content-Length': '302',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:676: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:685: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:473: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:481: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:874:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:831:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host':
 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:839:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:901:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:910:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:806: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:738: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:772: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 
'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:935: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:943: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:700:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:707:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:893:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '702', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:847:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:855:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '527', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSON_Unexpected(self): > result = self.__generic(constructQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:918: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '527', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:927: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinN3_Conneg(self): > result = self.__generic(constructQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:823: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:755: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 
'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:789:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle',
           'Connection': 'close', 'Content-Length': '490',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '524',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[urllib/socket traceback identical to the one above: do_open ->
 create_connection -> ConnectionRefusedError: [Errno 111] Connection refused
 (/usr/lib/python3.13/socket.py:849)]

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_fuseki2__v3_6_0__agrovoc.py:951:

[SPARQLWrapper/urllib call chain identical to the one above, ending in the
 same do_open listing with headers = {'Accept': 'application/rdf+xml', ...}]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '490',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[urllib/socket traceback identical to the one above: do_open ->
 create_connection -> ConnectionRefusedError: [Errno 111] Connection refused
 (/usr/lib/python3.13/socket.py:849)]

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:959:

[SPARQLWrapper/urllib call chain identical to the one above, ending in the
 same do_open listing with headers = {'Accept': 'application/rdf+xml', ...}]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '524',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[urllib/socket traceback identical to the one above: do_open ->
 create_connection -> ConnectionRefusedError: [Errno 111] Connection refused
 (/usr/lib/python3.13/socket.py:849)]

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:714:

[SPARQLWrapper/urllib call chain identical to the one above, ending in the
 same do_open listing with headers = {'Accept': 'application/rdf+xml', ...}]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '490',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[urllib/socket traceback identical to the one above: do_open ->
 create_connection -> ConnectionRefusedError: [Errno 111] Connection refused
 (/usr/lib/python3.13/socket.py:849)]

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:721:

[SPARQLWrapper/urllib call chain identical to the one above, ending in the
 same do_open listing with headers = {'Accept': 'application/rdf+xml', ...}]
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1143: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________
self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
> sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = def testDescribeByGETinJSONLD(self): > result = self.__generic(describeQuery, JSONLD, GET) test/test_fuseki2__v3_6_0__agrovoc.py:1103: test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
> raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________
self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
> sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = def testDescribeByGETinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:1110: test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
> raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________
self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
> sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = def testDescribeByGETinJSON_Unexpected(self): > result = self.__generic(describeQuery, JSON, GET) test/test_fuseki2__v3_6_0__agrovoc.py:1170: test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request)
> raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________
self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
> sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError
During handling of the above exception, another exception occurred:
self = def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) test/test_fuseki2__v3_6_0__agrovoc.py:1179: test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req,
self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3_Conneg(self): > result = self.__generic(describeQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1079: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1011: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1045: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 
'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow(self): > result = self.__generic(describeQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1212: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection':
'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:973: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:980: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1162: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '501', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSONLD(self): > result = self.__generic(describeQuery, JSONLD, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1117: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 
'Content-Length': '501', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1124: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '326',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:1187: 
...
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1196: 
...
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1096: 
...
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1028: 
...
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1062: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 
'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow(self): > result = self.__generic(describeQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1220: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML(self): > result = self.__generic(describeQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:987: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:994:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_fuseki2__v3_6_0__agrovoc.py:1253:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1238:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1244:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1241:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1257:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1266: 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:246: 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:253: 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:302: 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_6_0__agrovoc.SPARQLWrapperTests testMethod=testSelectByGETinJSONLD_Unexpected_Conneg>

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:386:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:309:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:344:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:274:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'agrovoc.uniroma2.it',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:281:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow(self): > result = self.__generic(selectQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:415: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow_Conneg(self): > result = self.__generic(selectQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:424: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 
'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML(self): > result = self.__generic(selectQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:214: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '466',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:260: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '466',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:267: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '393',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:316: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '393',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:407: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '356',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:323: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '356',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:365: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close',
'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '566', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:288: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '566', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:295: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 
'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow(self): > result = self.__generic(selectQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:433: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:442:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '356',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '390',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:230:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '390',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '356',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:238:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection':
           'close', 'Content-Length': '356',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:493:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:500:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:549:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:633: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:556: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:591: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV(self): > result = self.__generic(askQuery, TSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:521: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse 
object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV_Conneg(self): > result = self.__generic(askQuery, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:528: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_fuseki2__v3_8_0__stw.py:662:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________
self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:671: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:461:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________
self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:469: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinCSV(self): > result = self.__generic(askQuery, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, 
using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinCSV_Conneg(self): > result = self.__generic(askQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:514: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return 
an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '333', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON(self): > result = self.__generic(askQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:563: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '333', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:654: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:570: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:612:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '430', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinTSV(self): > result = self.__generic(askQuery, TSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:535: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '430', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object 
        for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:542:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req,
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:680:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an
        HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:689:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req,
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML(self): > result = self.__generic(askQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:477: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse 
object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:485: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:878: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD(self): > result = self.__generic(constructQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:835: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:843:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
           'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
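These failures all share one root cause: on the build host the test endpoint (Host: zbw.eu) is redirected to 127.0.0.1:9, where nothing listens, so the TCP connect is refused and urllib rewraps the resulting `ConnectionRefusedError` in a `URLError` at request.py:1322. A minimal illustrative sketch of that wrapping (not part of the build; it picks a dynamically chosen closed port instead of the hard-coded discard port 9):

```python
import socket
import urllib.error
import urllib.request

# Find a local TCP port with no listener: bind an ephemeral port,
# remember its number, then close the socket so the port is free again.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

# At the socket level the connect attempt fails immediately
# (ConnectionRefusedError, "[Errno 111]" on Linux) ...
try:
    socket.create_connection(("127.0.0.1", port), timeout=5)
    sock_err = None
except ConnectionRefusedError as exc:
    sock_err = exc

# ... and urllib.request wraps that same OSError in a URLError,
# the "raise URLError(err)" frame seen in the tracebacks here.
try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=5)
    url_err = None
except urllib.error.URLError as exc:
    url_err = exc

print(type(sock_err).__name__)        # ConnectionRefusedError
print(type(url_err.reason).__name__)  # ConnectionRefusedError
```

The original socket-level exception stays reachable as `URLError.reason`, which is why both exception types appear in each chained traceback.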
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:905: in testConstructByGETinJSON_Unexpected
    result = self.__generic(constructQuery, JSON, GET)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:914: in testConstructByGETinJSON_Unexpected_Conneg
    result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:810: in testConstructByGETinN3_Conneg
    result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:742: in testConstructByGETinRDFXML_Conneg
    result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

test/test_fuseki2__v3_8_0__stw.py:776: in testConstructByGETinTURTLE_Conneg
    result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:939: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:947: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML(self): > result = self.__generic(constructQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:704: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:711: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:897: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
                **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '696', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, POST)

test/test_fuseki2__v3_8_0__stw.py:851: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '696', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req,
                **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:859: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def
do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:922: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an
HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:931: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req,
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinN3_Conneg(self): > result = self.__generic(constructQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:827: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:759: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:793: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow(self): > result = self.__generic(constructQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:955: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow_Conneg(self): > result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:963: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML(self): > result = self.__generic(constructQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:718: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML_Conneg(self): > result = self.__generic(constructQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:725: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD(self): > result = self.__generic(describeQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1107: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1114: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________ self = <urllib.request.HTTPHandler object at 0x…> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x…>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected(self): > result = self.__generic(describeQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1174: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ self = <urllib.request.HTTPHandler object at 0x…> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x…>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…> headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1183: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ self = <urllib.request.HTTPHandler object at 0x…> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x…>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…> headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3_Conneg(self): > result = self.__generic(describeQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1083: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ self = <urllib.request.HTTPHandler object at 0x…> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x…>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…> headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1015: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ self = <urllib.request.HTTPHandler object at 0x…> http_class = <class 'http.client.HTTPConnection'> req = <urllib.request.Request object at 0x…>, http_conn_args = {} host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…> headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1049: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow(self): > result = self.__generic(describeQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1208: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1216: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:977: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:984: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args):
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '495', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, POST)

test/test_fuseki2__v3_8_0__stw.py:1121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '495', 'Content-Type': 'application/x-www-form-urlencoded', ...}

def do_open(self, http_class, req,
**http_conn_args):
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1128:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

def
do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '320', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:1191:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '320', 'Content-Type': 'application/x-www-form-urlencoded', ...}

def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse
object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1200: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinN3_Conneg(self): > result = self.__generic(describeQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1100: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 
'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1032: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
                http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1066: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow(self): > result = self.__generic(describeQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1232: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML(self): > result = self.__generic(describeQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:991: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML_Conneg(self): > result = self.__generic(describeQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:998: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_fuseki2__v3_8_0__stw.py:1257: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testQueryBadFormed _____________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_fuseki2__v3_8_0__stw.py:1242: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryDuplicatedPrefix(self): > result = self.__generic(queryDuplicatedPrefix, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1248: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1245: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_1(self): > result = self.__generic(queryWithCommaInCurie_1, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1261: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1270: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:250: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse 
object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:257: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON(self): > result = self.__generic(selectQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:306: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:390: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')

    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)

    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})

    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?

    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}

    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by
        :func:`getdefaulttimeout` is used.  If *source_address* is set it
        must be a tuple of (host, port) for the socket to bind as a source
        address before making the connection.  A host of '' or port 0
        tells the OS to use the default.  When a connection cannot be
        created, raises the last error if *all_errors* is False, and an
        ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:348:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    [do_open() source listing identical to the one above, down to:]
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
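Every failure in this log follows the same pattern: the build's HTTP proxy points at 127.0.0.1:9 (the "discard" port), nothing listens there, so the TCP connect is refused with errno 111. A minimal standalone sketch of that failure mode; the `probe()` helper and the bind-then-close trick to obtain a known-closed port are illustrations, not part of the test suite:

```python
# Reproduce the ConnectionRefusedError seen in the tracebacks above:
# connecting to a local port with no listener makes the kernel answer
# with ECONNREFUSED, which socket.create_connection surfaces as
# ConnectionRefusedError.  We find a closed port dynamically instead of
# hard-coding port 9, so the sketch is self-contained.
import errno
import socket

def probe(host: str, port: int):
    """Return None on success, or the errno of the connection failure."""
    try:
        with socket.create_connection((host, port), timeout=2):
            return None
    except OSError as exc:
        return exc.errno

# Reserve a port that is guaranteed to be free once we release it.
tmp = socket.socket()
tmp.bind(("127.0.0.1", 0))
closed_port = tmp.getsockname()[1]
tmp.close()

result = probe("127.0.0.1", closed_port)
print(result == errno.ECONNREFUSED)  # → True ([Errno 111] Connection refused)
```

This is the same error the log shows at /usr/lib/python3.13/socket.py:849, before urllib wraps it.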
    http_class must implement the HTTPConnection API from http.client.
    """
    [remainder of do_open() identical to the first listing above, down to:]
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[http.client / socket call chain identical to the first failure above]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = source_address = None

[socket.create_connection() listing identical to the first failure above, down to:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_fuseki2__v3_8_0__stw.py:278:
[test/test_fuseki2__v3_8_0__stw.py:194 -> SPARQLWrapper/Wrapper.py ->
 urllib.request call chain identical to the first failure above]

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[second do_open() listing identical to the first failure above, down to:]
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
    """
    [remainder of do_open() identical to the first listing above, down to:]
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[http.client / socket call chain identical to the first failure above]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = source_address = None

[socket.create_connection() listing identical to the first failure above, down to:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:285:
[test/test_fuseki2__v3_8_0__stw.py:194 -> SPARQLWrapper/Wrapper.py ->
 urllib.request call chain identical to the first failure above]

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[second do_open() listing identical to the first failure above, down to:]
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
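The second half of each traceback shows urllib's `do_open()` catching the low-level `OSError` and re-raising it as `URLError` (request.py:1322), which is why the tests ultimately report `urllib.error.URLError` rather than `ConnectionRefusedError`. A short sketch of that wrapping, again against a deliberately closed local port rather than the log's hard-wired 127.0.0.1:9:

```python
# Demonstrate urllib's exception wrapping: the socket-level
# ConnectionRefusedError survives as the .reason attribute of the
# URLError that urlopen() raises.
import socket
import urllib.request
from urllib.error import URLError

# Obtain a local port with no listener.
tmp = socket.socket()
tmp.bind(("127.0.0.1", 0))
port = tmp.getsockname()[1]
tmp.close()

try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=2)
except URLError as exc:
    # The original OSError is kept, not discarded.
    print(type(exc.reason).__name__)  # → ConnectionRefusedError
```

So both "E ConnectionRefusedError" and "E urllib.error.URLError" in the log describe a single underlying event per test.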
    http_class must implement the HTTPConnection API from http.client.
    """
    [remainder of do_open() identical to the first listing above, down to:]
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[http.client / socket call chain identical to the first failure above]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = source_address = None

[socket.create_connection() listing identical to the first failure above, down to:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_fuseki2__v3_8_0__stw.py:419:
[test/test_fuseki2__v3_8_0__stw.py:194 -> SPARQLWrapper/Wrapper.py ->
 urllib.request call chain identical to the first failure above]

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[second do_open() listing identical to the first failure above, down to:]
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
    """
    [remainder of do_open() identical to the first listing above, down to:]
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[http.client / socket call chain identical to the first failure above]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = source_address = None

[socket.create_connection() listing identical to the first failure above, down to:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:428:
[test/test_fuseki2__v3_8_0__stw.py:194 -> SPARQLWrapper/Wrapper.py ->
 urllib.request call chain identical to the first failure above]

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[second do_open() listing identical to the first failure above, down to:]
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    http_class must implement the HTTPConnection API from http.client.
    """
    [remainder of do_open() identical to the first listing above, down to:]
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[http.client / socket call chain identical to the first failure above]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = source_address = None

[socket.create_connection() listing identical to the first failure above, down to:]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:218:
[test/test_fuseki2__v3_8_0__stw.py:194 -> SPARQLWrapper/Wrapper.py ->
 urllib.request call chain identical to the first failure above]

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:226: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:264: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse 
object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:271: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '387', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON(self): > result = self.__generic(selectQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:320: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '387', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:411: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Content-Length': '350',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:327: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:369: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '537', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:292: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '537', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return 
an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:299:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow(self): > result = self.__generic(selectQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:437: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
        """
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:446:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)

test/test_fuseki2__v3_8_0__stw.py:234:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:242:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:663: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
                http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        """

        host, port = address
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:585:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
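Each failure prints two chained tracebacks because `do_open` catches the low-level `OSError` and re-raises it as `urllib.error.URLError`, producing the "During handling of the above exception, another exception occurred" pairs seen throughout. The pairing can be reproduced without SPARQLWrapper at all; this sketch assumes a closed loopback port (found with the same illustrative bind-and-release trick) and an empty `ProxyHandler` so that no proxy environment variables redirect the request:

```python
import socket
import urllib.error
import urllib.request

# Find a loopback port with no listener: bind an ephemeral port,
# note its number, then release it before the request.
with socket.socket() as s:
    s.bind(("127.0.0.1", 0))
    port = s.getsockname()[1]

# Disable proxy autodetection so the request really targets loopback.
opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))

try:
    opener.open(f"http://127.0.0.1:{port}/", timeout=2)
except urllib.error.URLError as exc:
    caught = exc

# do_open wraps the low-level OSError, so .reason carries the original
# ConnectionRefusedError -- the same pairing seen in the log above.
print(type(caught).__name__, "<-", type(caught.reason).__name__)
```

This is why the tests ultimately fail with `URLError` at `urllib/request.py:1322` even though the real error was raised much deeper, in `socket.py:849`.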
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        """

        host, port = address
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:621:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        """

        host, port = address
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:702:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
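All of these tests expect a live SPARQL endpoint (note the `Host: factforge.net` header), while the reproducible-builds environment deliberately points HTTP traffic at the dead proxy 127.0.0.1:9 to keep builds offline. A common mitigation for suites like this, sketched here as an assumption rather than anything the package actually ships, is to probe reachability once and skip the network-dependent tests instead of letting every one of them error out:

```python
import socket
import unittest

def endpoint_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Best-effort TCP probe; False whenever the endpoint cannot be reached."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical guard mirroring the proxy address from this log; with no
# listener on 127.0.0.1:9 the whole class is skipped instead of erroring.
@unittest.skipUnless(endpoint_reachable("127.0.0.1", 9),
                     "SPARQL endpoint unreachable; skipping network tests")
class SPARQLWrapperNetworkTests(unittest.TestCase):
    def test_ask_by_get(self):
        ...  # the real conneg query would run here
```

With such a guard the run would report skips rather than the wall of `URLError` failures shown here; whether skipping is acceptable is a packaging policy decision, not a code fix.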
        http_class must implement the HTTPConnection API from http.client.
        """
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        """

        host, port = address
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:488:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:684: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:600: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:642: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow_Conneg(self): > result = self.__generic(askQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:721: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:505: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:898: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:864: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:936: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:834: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:774: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:804: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:972: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} 
def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:744: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:917: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:879:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '573', 'Content-Type':
'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:955:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:849:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:789:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:819:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow_Conneg(self): > result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:989: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML_Conneg(self): > result = self.__generic(constructQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:759: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1166: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1132: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1204: 
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1102: 
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1042: 
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1072: 
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1240: 
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1012: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1147: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1223: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinN3_Conneg(self): > result = self.__generic(describeQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1117: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '424', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1057: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
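As the `raise URLError(err)` frame shows, `do_open` wraps the low-level `OSError` in a `urllib.error.URLError`, which is what `SPARQLWrapper.query()` ultimately sees. A sketch demonstrating that wrapping against a closed local port (an empty `ProxyHandler` is passed so any configured proxy is bypassed; the helper name is ours):

```python
import socket
import urllib.error
import urllib.request

# Find a local port that is (almost certainly) closed: bind an
# ephemeral port, note its number, and release it again.
_s = socket.socket()
_s.bind(("127.0.0.1", 0))
_port = _s.getsockname()[1]
_s.close()

def classify_urlopen_failure(url):
    """Return (exception class, class of its .reason) for a failed
    urlopen-style request, bypassing any configured proxy."""
    opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))
    try:
        opener.open(url, timeout=2)
    except urllib.error.URLError as err:
        # do_open raised URLError(err); the original OSError
        # survives as the .reason attribute.
        return type(err), type(err.reason)
    return None, None

exc_cls, reason_cls = classify_urlopen_failure(f"http://127.0.0.1:{_port}/")
```

The original `ConnectionRefusedError` is preserved as `.reason`, so callers can still distinguish a refused connection from, say, a DNS failure.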
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1087: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', 
...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
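The `do_open` source repeated in these tracebacks normalizes headers before sending: it forces `Connection: close` (so the non-persistent `addinfourl` machinery works) and Title-Cases every header name. That normalization, isolated as a sketch (the function name is ours):

```python
def normalize_headers(headers):
    """Mimic the header handling visible in do_open above: force a
    non-persistent connection, then Title-Case every header name."""
    headers = dict(headers)  # don't mutate the caller's dict
    headers["Connection"] = "close"
    return {name.title(): val for name, val in headers.items()}
```

This is also why `testKeepAlive` below cannot actually get a persistent connection through plain `urllib`: the handler overwrites any `Connection` header the caller set.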
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1257: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
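The proxy-tunnel branch of `do_open` (the `req._tunnel_host` block above) moves `Proxy-Authorization` onto the CONNECT request and deletes it from the headers bound for the origin server, so proxy credentials never leak past the proxy. That split, isolated as a sketch (the function name is ours):

```python
def split_tunnel_headers(headers):
    """Mimic do_open's CONNECT handling above: Proxy-Authorization is
    moved onto the tunnel request and removed from the origin headers."""
    headers = dict(headers)  # work on a copy
    tunnel_headers = {}
    proxy_auth_hdr = "Proxy-Authorization"
    if proxy_auth_hdr in headers:
        # Credentials go to the proxy only, never the origin server.
        tunnel_headers[proxy_auth_hdr] = headers.pop(proxy_auth_hdr)
    return headers, tunnel_headers
```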
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML_Conneg(self): > result = self.__generic(describeQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1027: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
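The first lines of `do_open` merge two header stores on the `Request` object: `unredirected_hdrs` are copied first and win over same-named entries from `headers`. A sketch of that precedence (the helper name is ours; `headers` and `unredirected_hdrs` are the attributes visible in the source above):

```python
import urllib.request

def effective_headers(req):
    """Mirror the merge at the top of do_open: unredirected headers
    take precedence over ordinary ones with the same name."""
    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})
    return headers
```

This is why, for example, a `Content-Length` set as an unredirected header cannot be overridden by a redirectable one.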
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_graphdbEnterprise__v8_9_0__rs.py:1286: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} 
    def do_open(self, http_class, req, **http_conn_args):
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1268:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1271:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1290:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1298:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class,
                req, **http_conn_args):
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:269:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:407: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:329: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:365: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:299: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[... http.client / socket.create_connection() traceback identical to the first failure ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:445:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request traceback and do_open() listing identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

http_conn_args = {}
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'factforge.net',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the first failure ...]
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[... http.client / socket.create_connection() traceback identical to the first failure ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:236:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request traceback and do_open() listing identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

http_conn_args = {}
host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '677',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source listing identical to the first failure ...]
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[... http.client / socket.create_connection() traceback identical to the first failure ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:284:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request traceback and do_open() listing identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

http_conn_args = {}
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source listing identical to the first failure ...]
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[... http.client / socket.create_connection() traceback identical to the first failure ...]
E   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:428:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request traceback and do_open() listing identical to the first failure ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

http_conn_args = {}
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Content-Length': '677',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() source listing identical to the first failure ...]
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:386: 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

self = 
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:314: 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self = 
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:464: 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self = 
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:253: 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = 
source_address = None
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinCSV(self): > result = self.__generic(askQuery, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:536: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:543:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:604:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close',
'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:687:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0
(rdflib.github.io/sparqlwrapper)'}

                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:697:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host':
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:611: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinN3_Unexpected(self): > result = self.__generic(askQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:641: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:651: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV(self): > result = self.__generic(askQuery, TSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:570: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV_Conneg(self): > result = self.__generic(askQuery, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:577: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

[do_open frame, request headers ({'Accept': 'application/sparql-results+xml',
'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper
2.0.0 (rdflib.github.io/sparqlwrapper)'}), and connection frames identical to
the first failure above, ending in:]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:731: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and the re-raising do_open source identical to the first
failure above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

[do_open frame, request headers ({'Accept': 'application/sparql-results+xml',
'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper
2.0.0 (rdflib.github.io/sparqlwrapper)'}), and connection frames identical to
the first failure above, ending in:]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:740: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and the re-raising do_open source identical to the first
failure above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

[do_open frame, request headers ({'Accept': 'application/sparql-results+xml',
'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper
2.0.0 (rdflib.github.io/sparqlwrapper)'}), and connection frames identical to
the first failure above, ending in:]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:498: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and the re-raising do_open source identical to the first
failure above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

[do_open frame, request headers ({'Accept': 'application/sparql-results+xml',
'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper
2.0.0 (rdflib.github.io/sparqlwrapper)'}), and connection frames identical to
the first failure above, ending in:]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:506: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and the re-raising do_open source identical to the first
failure above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

[do_open frame, request headers ({'Accept': '*/*', 'Connection': 'close',
'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0
(rdflib.github.io/sparqlwrapper)'}), and connection frames identical to the
first failure above, ending in:]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:967: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames and the re-raising do_open source identical to the first
failure above]
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:976: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:928: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:936: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1010: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1019: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_lov-fuseki_on_hold.py:890: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:898: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_lov-fuseki_on_hold.py:814: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:822: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)

test/test_lov-fuseki_on_hold.py:852: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:860: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 
'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1052: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1060: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:779: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:786: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1280: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1289: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1244: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1251: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected(self): > result = self.__generic(describeQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1323: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 
2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1332: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3(self): > result = self.__generic(describeQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1207: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 
'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3_Conneg(self): > result = self.__generic(describeQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1215: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.  When a connection cannot be created, raises the last
        error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_lov-fuseki_on_hold.py:1131:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1139:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)

test/test_lov-fuseki_on_hold.py:1169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1177:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_lov-fuseki_on_hold.py:1365:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1373: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1096: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1103: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_lov-fuseki_on_hold.py:1423: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 
'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1414: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1411:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1443:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:262:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:323:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:406:

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:416:

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:330:

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, GET)

test/test_lov-fuseki_on_hold.py:360:

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:370: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:296: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 
'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow(self): > result = self.__generic(selectQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:450: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:459:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:217:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:225:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:674:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:591:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:628: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
[urllib/socket traceback identical to the failure above: ConnectionRefusedError: [Errno 111] Connection refused at /usr/lib/python3.13/socket.py:849, re-raised as urllib.error.URLError at /usr/lib/python3.13/urllib/request.py:1322]

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:716:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        http_class must implement the HTTPConnection API from http.client.
        """
[urllib/socket traceback identical to the first failure above: ConnectionRefusedError: [Errno 111] Connection refused at /usr/lib/python3.13/socket.py:849, re-raised as urllib.error.URLError at /usr/lib/python3.13/urllib/request.py:1322]

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:494:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:697: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
[urllib/socket traceback identical to the first failure above: ConnectionRefusedError: [Errno 111] Connection refused at /usr/lib/python3.13/socket.py:849, re-raised as urllib.error.URLError at /usr/lib/python3.13/urllib/request.py:1322]

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:606:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
[urllib/socket traceback identical to the first failure above: ConnectionRefusedError: [Errno 111] Connection refused at /usr/lib/python3.13/socket.py:849, re-raised as urllib.error.URLError at /usr/lib/python3.13/urllib/request.py:1322]

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:651:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow_Conneg(self): > result = self.__generic(askQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:735: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:511: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:878: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} 
    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.  When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:950:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.  When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:848:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.  When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:788:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.  When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:818:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.  When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:986:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:758: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:931: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:893: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:969: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:863:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:803:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:833:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1003:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:773:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1180:

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1146: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} 
def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1218: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3_Conneg(self): > result = self.__generic(describeQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1116: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <…>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1056:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        [… same do_open body as listed in full above; the connection attempt
        again reaches socket.create_connection and is refused …]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1086:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[… same urllib call chain as above, ending in do_open …]
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        [… same do_open body as listed in full above; the connection attempt
        again reaches socket.create_connection and is refused …]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1254:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[… same urllib call chain as above, ending in do_open …]
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'vocabs.ands.org.au',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        [… same do_open body as listed in full above; the connection attempt
        again reaches socket.create_connection and is refused …]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1025:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[… same urllib call chain as above, ending in do_open …]
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        [… same do_open body as listed in full above; the connection attempt
        again reaches socket.create_connection and is refused …]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1161:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[… same urllib call chain as above, ending in do_open …]
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1237:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
<br></br>
test/test_rdf4j__geosciml.py:1131:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1071:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1101:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.
    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if *all_errors*
    is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1271: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
            **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.
    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')
    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)
    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})
    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?
    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:
/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________
self = http_class = req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
def do_open(self, http_class, req, **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.
    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')
    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)
    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})
    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?
    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.
    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if *all_errors*
    is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML_Conneg(self): > result = self.__generic(describeQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1041: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
            **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.
    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')
    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)
    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})
    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?
    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:
/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________
self = http_class = req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
def do_open(self, http_class, req, **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.
    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')
    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)
    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})
    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?
    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.
    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if *all_errors*
    is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_rdf4j__geosciml.py:1305: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.
    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')
    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)
    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})
    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?
    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}
    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:
/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryBadFormed_1 ____________________
def do_open(self, http_class, req, **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.
    http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.
    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if *all_errors*
    is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed_1(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed_1, XML, GET) test/test_rdf4j__geosciml.py:1282: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.
    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if *all_errors*
    is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_1(self): > result = self.__generic(queryWithCommaInCurie_1, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1309: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1317: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:266: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:409: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:326: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:363
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:296
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:451
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:233
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E               ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:281
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:432: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:341: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:386: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:311: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, 
                                                  req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:470:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req,
                                                  **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:250:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req,
                                                  **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_stardog__lindas.py:678:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_stardog__lindas.py:595:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:632:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self =
http_class = req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_stardog__lindas.py:720:
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self =
http_class = req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_stardog__lindas.py:498:
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self =
http_class = req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_stardog__lindas.py:701:
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self =
http_class = req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '162',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_stardog__lindas.py:610:
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self =
http_class = req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:655:
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow_Conneg(self): > result = self.__generic(askQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:739: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:515: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': 
'162', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:916: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:882: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 
'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:954: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:852: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_stardog__lindas.py:792: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_stardog__lindas.py:822: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_stardog__lindas.py:990: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_stardog__lindas.py:762: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host':
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_stardog__lindas.py:935: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length':
'342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_stardog__lindas.py:897: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld',
'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_stardog__lindas.py:973: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length':
'342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:867: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept':
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:807: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:837: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 
'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow_Conneg(self): > result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1007: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML_Conneg(self): > result = self.__generic(constructQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:777: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1183:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1149:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1221:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1119:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1059:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val
                   for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1089: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 
'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1257:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1029:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1202:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Content-Length': '149',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1164:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Content-Length': '149',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1240:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length':
'149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinN3_Conneg(self): > result = self.__generic(describeQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1134: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1074: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1104: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 
'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1274: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1044:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length':
'149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
'Connection': 'close', 'Host': 'lindas.admin.ch',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_stardog__lindas.py:1307:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
'Connection': 'close',
'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
'Host': 'lindas.admin.ch',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1298:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
'Host': 'lindas.admin.ch',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1293:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
'Host': 'lindas.admin.ch',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:270: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host':
 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:413: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host':
 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:330: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept':
 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:367: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host':
 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:300: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host':
 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:455: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:237: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '386',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:285: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length':
 '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:436: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': 
'386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:345: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:390: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:315: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 
'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:474: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:254: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:520: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:527: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:583: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:673: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'rdf.chise.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:590: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:627: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return 
an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV_Conneg(self): > result = self.__generic(askQuery, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:560: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:718: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:494:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:942: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:981: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:872:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:797:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:834:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self,
http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1020:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class,
req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:763:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
**http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1246:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
**http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1287:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
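Each failure in this log follows the same two-stage pattern: the socket-level ConnectionRefusedError is caught in do_open() at urllib/request.py:1322 and re-raised as urllib.error.URLError, with the original exception preserved as .reason. A minimal standalone sketch of that wrapping (not part of the test suite):

```python
import errno
import urllib.error

# Mimic urllib.request's do_open(): any OSError raised while sending the
# request is re-raised as URLError; the original error survives as .reason.
try:
    raise ConnectionRefusedError(errno.ECONNREFUSED, "Connection refused")
except OSError as err:
    wrapped = urllib.error.URLError(err)

print(isinstance(wrapped.reason, ConnectionRefusedError))  # True
print(wrapped.reason.errno == errno.ECONNREFUSED)          # True
```

URLError itself subclasses OSError, so callers such as SPARQLWrapper can catch either the wrapper or plain OSError.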
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:1097: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
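One detail of the do_open() listing repeated throughout this log: before the request is sent, header names are canonicalized with str.title(), which is why the captured headers always show 'Accept', 'Connection', 'Host' and 'User-Agent' in exactly that capitalization regardless of how they were set. Illustrated in isolation:

```python
# do_open() normalizes header names with str.title() before sending them.
raw = {"user-agent": "sparqlwrapper 2.0.0", "connection": "close", "accept": "*/*"}
sent = {name.title(): val for name, val in raw.items()}
print(sorted(sent))  # ['Accept', 'Connection', 'User-Agent']
```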
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:1326: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
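The create_connection() docstring repeated in these tracebacks describes the loop that produces the [Errno 111] errors: each address from getaddrinfo() is tried in turn, per-address errors are collected, and the last one is re-raised when no attempt succeeds. The sketch below reproduces the refused connection outside the test suite; binding and closing an ephemeral socket to find a loopback port with no listener is an assumption of this demo, not something the log does:

```python
import socket

# Find a loopback port with no listener: bind an ephemeral port,
# note its number, then close the socket (small race, fine for a demo).
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
unused_port = probe.getsockname()[1]
probe.close()

# create_connection() walks the getaddrinfo() results and re-raises the
# last collected error when every connect() attempt fails.
caught = None
try:
    socket.create_connection(("127.0.0.1", unused_port), timeout=2)
except OSError as err:
    caught = err

print(type(caught).__name__)  # ConnectionRefusedError on a typical Linux host
```

In the log the endpoint resolves to 127.0.0.1:9 (a port nothing listens on), so every test that actually opens a network connection fails this way.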
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:1062: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client frames and create_connection() source identical to the first failure ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_store__v1_1_4.py:1371:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request frames identical to the first failure ...]

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self,
                http_class, req, **http_conn_args):
        [... do_open() source identical to the listing above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <...>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() source, http.client frames and create_connection() docstring identical to the first failure ...]
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_store__v1_1_4.py:1356: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        [... remainder of do_open source context identical to the first listing above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client and socket frames identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... source and docstring identical to the first create_connection listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_store__v1_1_4.py:1362:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request frames identical to the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open source identical to the first listing above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        [... remainder of do_open source context identical to the first listing above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client and socket frames identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... source and docstring identical to the first create_connection listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_store__v1_1_4.py:1359:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request frames identical to the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open source identical to the first listing above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        [... remainder of do_open source context identical to the first listing above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client and socket frames identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... source and docstring identical to the first create_connection listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_store__v1_1_4.py:1387:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request frames identical to the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open source identical to the first listing above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        [... remainder of do_open source context identical to the first listing above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client and socket frames identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... source and docstring identical to the first create_connection listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_store__v1_1_4.py:247:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request frames identical to the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open source identical to the first listing above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        [... remainder of do_open source context identical to the first listing above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client and socket frames identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... source and docstring identical to the first create_connection listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:254:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request frames identical to the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        [... do_open source context continues as above ...]
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object. Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect. If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used. If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:310:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =  http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object. Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect. If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used. If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:403:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self =  http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object. Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect. If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used. If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:317:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self =  http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object. Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect. If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used. If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:357:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self =  http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object. Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect. If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used. If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:287:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self,
                http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:448:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class,
                      req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:221:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
                      **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:526:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:533:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host':
'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON(self): > result = self.__generic(askQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:586: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:593: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV(self): > result = self.__generic(askQuery, TSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:556: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinTSV_Conneg(self): > result = self.__generic(askQuery, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:563: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 
'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:728:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:737:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:492:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:500:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '228',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:608:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '228',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close',
    'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making
        the connection.  A host of '' or port 0 tells the OS to use the
        default.  When a connection cannot be created, raises the last
        error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:950: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close',
    'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
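Every failure in this section follows the same shape: the test suite's dbpedia.org endpoint apparently resolves to 127.0.0.1:9 in this sealed-off build environment, nothing accepts the connection, and `do_open()` converts the resulting `OSError` into `urllib.error.URLError`. A minimal sketch of that wrapping, against a localhost port chosen at runtime to be closed (not the log's port 9):

```python
import socket
import urllib.error
import urllib.request

# Reserve an ephemeral localhost port, then release it so the later
# connect attempt is guaranteed to be refused.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
closed_port = probe.getsockname()[1]
probe.close()

caught = None
try:
    urllib.request.urlopen(f"http://127.0.0.1:{closed_port}/", timeout=5)
except urllib.error.URLError as err:
    # do_open() catches the OSError from sock.connect() and re-raises it
    # as URLError, which is what every test in this log ultimately sees.
    caught = err

print(type(caught.reason).__name__)
```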
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept':
    'application/ld+json,application/x-json+ld', 'Connection': 'close',
    'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:895: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept':
    'application/ld+json,application/x-json+ld', 'Connection': 'close',
    'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept':
    'application/ld+json,application/x-json+ld', 'Connection': 'close',
    'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:904: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept':
    'application/ld+json,application/x-json+ld', 'Connection': 'close',
    'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept':
    'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
    'Connection': 'close', 'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:866: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept':
    'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
    'Connection': 'close', 'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept':
    'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
    'Connection': 'close', 'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:873: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = headers = {'Accept':
    'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
    'Connection': 'close', 'Host': 'dbpedia.org',
    'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:802:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:809:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:833:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:841:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1048:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class =
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:1056: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 
'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML(self): > result = self.__generic(constructQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:772: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:779: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '408', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:977: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 
'close', 'Content-Length': '408', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:880:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '408', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1073:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '408', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1266:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1211:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1181:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1188:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1117:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1124:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1148: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host':
 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
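All of these failures reduce to the same condition: the test suite tries to reach dbpedia.org through an HTTPS proxy at 127.0.0.1:9 (the discard port), where nothing listens, so every request dies with `ConnectionRefusedError`, which urllib then wraps in `URLError` at `request.py:1322`. A minimal sketch reproducing the wrapping, assuming no service is listening on local port 9 (the endpoint URL here is only illustrative):

```python
import urllib.error
import urllib.request

# Route HTTPS requests through a proxy on the discard port (9), where
# nothing listens -- mirroring the unreachable-proxy setup in the log.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"https": "http://127.0.0.1:9"})
)

reason = None
try:
    opener.open("https://dbpedia.org/sparql", timeout=5)
except urllib.error.URLError as err:
    # do_open() catches the socket-level OSError and re-raises it as
    # URLError(err), so the original error survives as err.reason.
    reason = err.reason

assert isinstance(reason, ConnectionRefusedError)
```

Because the refused TCP connect happens at the proxy, the same `URLError` appears regardless of which format (XML, Turtle, etc.) or query method the individual test exercises.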
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1156: 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1364: 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1372: 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1087: 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1094:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err:  # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
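Every failure in this section originates in the `socket.create_connection` listing shown above: the test endpoint resolves through a proxy to `127.0.0.1:9`, where nothing listens, so the kernel answers the SYN with RST and `connect()` fails with `ECONNREFUSED` (errno 111). A minimal, illustrative probe (not part of the test suite; the function name is ours) reproduces that condition:

```python
import socket

# Illustrative probe: classify what happens when connecting to (host, port).
# A closed local port yields ConnectionRefusedError, exactly as in the
# tracebacks above; a filtered or unroutable host yields some other OSError.
def probe(host: str, port: int, timeout: float = 1.0) -> str:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return "connected"
    except ConnectionRefusedError:
        return "refused"
    except OSError:
        return "unreachable"
```

On a build host with no network access, `probe("127.0.0.1", 9)` returns `"refused"`, which is why every test that needs dbpedia.org fails the same way.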
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =  source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_virtuoso__v7_20_3230__dbpedia.py:1416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err:  # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =  source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1401:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
            except OSError as err:  # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =  source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1407:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err:  # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =  source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1404:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
           'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err:  # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:1428: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:248: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:255: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 
'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON(self): > result = self.__generic(selectQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:308: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:406:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =  source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:417:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

________________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =  source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:315:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =  source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:278:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =  source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:285:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =  source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:448:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:457:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:214:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:222:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '349',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, POST)

test/test_virtuoso__v7_20_3230__dbpedia.py:428:
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '278', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of
        (host, port) for the socket to bind as a source address before
        making the connection.  A host of '' or port 0 tells the OS to
        use the default.  When a connection cannot be created, raises the
        last error if *all_errors* is False, and an ExceptionGroup of all
        errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:439:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '278',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
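The `socket.create_connection` source quoted in the traceback walks every `getaddrinfo` result, collects the per-address errors, and re-raises one of them when no address accepts the connection (the `all_errors=False` path). A rough standalone equivalent of that loop, for illustration only (`try_connect` is an invented name, not the stdlib API):

```python
import socket

def try_connect(host, port, timeout=1.0):
    """Sketch of the loop inside socket.create_connection: try each
    resolved address; if none connects, re-raise the first collected
    error (the all_errors=False behaviour)."""
    exceptions = []
    for af, socktype, proto, _, sa in socket.getaddrinfo(
            host, port, 0, socket.SOCK_STREAM):
        sock = None
        try:
            sock = socket.socket(af, socktype, proto)
            sock.settimeout(timeout)
            sock.connect(sa)
            return sock  # first address that works wins
        except OSError as exc:
            if sock is not None:
                sock.close()
            exceptions.append(exc)
    raise exceptions[0]
```

With a listener-less local port (standing in for the dead `127.0.0.1:9` endpoint these tests hit), the loop exhausts its single candidate address and re-raises the `ConnectionRefusedError` seen in every failure above.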
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:528:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
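Every failure shows two linked tracebacks: first the `ConnectionRefusedError` from the socket layer, then, under "During handling of the above exception, another exception occurred:", the `URLError` that `do_open` raises from inside its `except OSError` block. Python records the original error implicitly as `__context__`, and `URLError` also keeps it as `.reason`. The chaining can be reproduced in a few lines (a sketch mimicking the shape of `do_open`'s handler, not the real request path):

```python
from urllib.error import URLError

def do_request():
    # Same shape as do_open's handler: the OSError is wrapped in a
    # URLError, and Python implicitly records it as __context__.
    try:
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as err:
        raise URLError(err)

try:
    do_request()
except URLError as exc:
    # The original socket error survives as both .reason and __context__.
    print(type(exc.reason).__name__, type(exc.__context__).__name__)
    # -> ConnectionRefusedError ConnectionRefusedError
```

This is why pytest prints the refused-connection traceback and the `URLError` traceback together for each test.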
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:535:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:588:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:595:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
[... urllib.request / http.client / socket frames identical to the failure above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:558: 
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the failure above ...]

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the one above, ending with ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the one above, up to ...]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
[... urllib.request / http.client / socket frames identical to the failure above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:565: 
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the failure above ...]

headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the one above, ending with ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the one above, up to ...]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
[... urllib.request / http.client / socket frames identical to the failure above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:731: 
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the failure above ...]

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the one above, ending with ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the one above, up to ...]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
[... urllib.request / http.client / socket frames identical to the failure above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:740: 
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the failure above ...]

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the one above, ending with ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the one above, up to ...]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
[... urllib.request / http.client / socket frames identical to the failure above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:494: 
[... SPARQLWrapper/Wrapper.py and urllib frames identical to the failure above ...]

headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... do_open() source listing identical to the one above, up to ...]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:502: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 
'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:954: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 
'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD(self): > result = self.__generic(constructQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:899: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 
'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:908: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3(self): > result = self.__generic(constructQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:869: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:876: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML(self): > result = self.__generic(constructQuery, RDFXML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:805: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 
'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:812: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 
'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE>

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:836:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x…>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x…>
http_conn_args = {'context': <ssl.SSLContext object at 0x…>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x…>
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host':
'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host':
'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open() locals and source listing identical to the first failure above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[... http.client frame chain and socket.create_connection() listing identical
     to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinTURTLE_Conneg>

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:844:
[... __generic() / SPARQLWrapper/Wrapper.py / urllib.request frame chain and
     do_open() locals identical to the first failure above ...]

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host':
'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open() source listing identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open() locals and source listing identical to the first failure above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[... http.client frame chain and socket.create_connection() listing identical
     to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinUnknow>

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1053:
[... __generic() / SPARQLWrapper/Wrapper.py / urllib.request frame chain and
     do_open() locals identical to the first failure above ...]

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open() source listing identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open() locals and source listing identical to the first failure above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[... http.client frame chain and socket.create_connection() listing identical
     to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinUnknow_Conneg>

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1061:
[... __generic() / SPARQLWrapper/Wrapper.py / urllib.request frame chain and
     do_open() locals identical to the first failure above ...]

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open() source listing identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open() locals and source listing identical to the first failure above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[... http.client frame chain and socket.create_connection() listing identical
     to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinXML>

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:775:
[... __generic() / SPARQLWrapper/Wrapper.py / urllib.request frame chain and
     do_open() locals identical to the first failure above ...]

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open() source listing identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open() locals and source listing identical to the first failure above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
[... http.client frame chain and socket.create_connection() listing identical
     to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_virtuoso__v8_03_3313__dbpedia.SPARQLWrapperTests testMethod=testConstructByGETinXML_Conneg>

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:782:
[... __generic() / SPARQLWrapper/Wrapper.py / urllib.request frame chain and
     do_open() locals identical to the first failure above ...]

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection':
'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source identical to the first traceback above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
[... same http.client call chain as in the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() source identical to the first traceback above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1272:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... same urllib.request call chain as in the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source identical to the first traceback above ...]
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source identical to the first traceback above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
[... same http.client call chain as in the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() source identical to the first traceback above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1217:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... same urllib.request call chain as in the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source identical to the first traceback above ...]
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source identical to the first traceback above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
[... same http.client call chain as in the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() source identical to the first traceback above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1226:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... same urllib.request call chain as in the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source identical to the first traceback above ...]
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source identical to the first traceback above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
[... same http.client call chain as in the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() source identical to the first traceback above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1187:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... same urllib.request call chain as in the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source identical to the first traceback above ...]
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =   http_class =   req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1194:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =   http_class =   req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1123:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1154:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1162:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
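Aside: the `create_connection()` source quoted in these tracebacks shows why each failure surfaces as a single `ConnectionRefusedError` — the function tries every address that `getaddrinfo()` yields, collects the per-address errors, and re-raises the first one (`raise exceptions[0]`) when no candidate connects. A minimal standalone sketch of that pattern (the function name is ours, not a stdlib API):

```python
import socket

def connect_first_working(host, port, timeout=5.0):
    # Mirror of the stdlib pattern quoted above: try each address
    # returned by getaddrinfo(), remember every failure, and re-raise
    # the first recorded error if no candidate address connects.
    exceptions = []
    for af, socktype, proto, _canon, sa in socket.getaddrinfo(
            host, port, 0, socket.SOCK_STREAM):
        sock = None
        try:
            sock = socket.socket(af, socktype, proto)
            sock.settimeout(timeout)
            sock.connect(sa)
            return sock  # first address that works wins
        except OSError as err:
            if sock is not None:
                sock.close()
            exceptions.append(err)
    if exceptions:
        raise exceptions[0]
    raise OSError("getaddrinfo returned no results")
```

With nothing listening on the target port this raises `ConnectionRefusedError`, exactly the shape seen in the log.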
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1370:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1378:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1093:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1100:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_virtuoso__v8_03_3313__dbpedia.py:1422:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_virtuoso__v8_03_3313__dbpedia.py:1407: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryDuplicatedPrefix(self): > result = self.__generic(queryDuplicatedPrefix, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:1413: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1410: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1426: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1433: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self =  http_class =  req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =  http_class =  req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:248: 

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =  http_class =  req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:255: 

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =  http_class =  req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:308: 

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self =  http_class =  req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:406: 

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =  http_class =  req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:417: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 
'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:315: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:278: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 
'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:285: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 
'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:450:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:459:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:214:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:222:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________________ QueryResult_Test.testConvert _________________________

self =

    def testConvert(self):
        class FakeResponse(object):
            def __init__(self, content_type):
                self.content_type = content_type

            def info(self):
                return {"content-type": self.content_type}

            def read(self, len):
                return ""

        def _mime_vs_type(mime, requested_type):
            """
            :param mime: mimetype/Content-Type of the response
            :param requested_type: requested mimetype (alias)
            :return: number of warnings produced by combo
            """
            with warnings.catch_warnings(record=True) as w:
                qr = QueryResult((FakeResponse(mime), requested_type))
                try:
                    qr.convert()
                except:
                    pass
                # if len(w) > 0: print(w[0].message)  # FOR DEBUG
                # if len(w) > 1: print(w[1].message)  # FOR DEBUG
                return len(w)

        # In the cases of "application/ld+json" and "application/rdf+xml", the
        # RDFLib raised a warning because the manually created QueryResult has no real
        # response value (implemented a fake read).
        # "WARNING:rdflib.term: does not look like a valid URI, trying to serialize this will break."
        self.assertEqual(0, _mime_vs_type("application/sparql-results+xml", XML))
        self.assertEqual(0, _mime_vs_type("application/sparql-results+json", JSON))
        self.assertEqual(0, _mime_vs_type("text/n3", N3))
        self.assertEqual(0, _mime_vs_type("text/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/json", JSON))
>       self.assertEqual(0, _mime_vs_type("application/ld+json", JSONLD))
E       AssertionError: 0 != 1

test/test_wrapper.py:876: AssertionError
=============================== warnings summary ===============================
test/test_agrovoc-allegrograph_on_hold.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_agrovoc-allegrograph_on_hold.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_allegrograph__v4_14_1__mmi.py:166
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_allegrograph__v4_14_1__mmi.py:166: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_blazegraph__wikidata.py:175
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_blazegraph__wikidata.py:175: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_fuseki2__v3_6_0__agrovoc.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_fuseki2__v3_6_0__agrovoc.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_fuseki2__v3_8_0__stw.py:168
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_fuseki2__v3_8_0__stw.py:168: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_graphdbEnterprise__v8_9_0__rs.py:179
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_graphdbEnterprise__v8_9_0__rs.py:179: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_lov-fuseki_on_hold.py:170
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_lov-fuseki_on_hold.py:170: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_rdf4j__geosciml.py:176
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_rdf4j__geosciml.py:176: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_stardog__lindas.py:180
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_stardog__lindas.py:180: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_store__v1_1_4.py:165
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_store__v1_1_4.py:165: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_virtuoso__v7_20_3230__dbpedia.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_virtuoso__v7_20_3230__dbpedia.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_virtuoso__v8_03_3313__dbpedia.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_virtuoso__v8_03_3313__dbpedia.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'ASK' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'ASK' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
test/test_blazegraph__wikidata.py: 8 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
test/test_fuseki2__v3_8_0__stw.py: 8 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
test/test_lov-fuseki_on_hold.py: 8 warnings
test/test_rdf4j__geosciml.py: 4 warnings
test/test_stardog__lindas.py: 4 warnings
test/test_store__v1_1_4.py: 8 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 8 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 8 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'foo'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
test/test_blazegraph__wikidata.py: 8 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
test/test_fuseki2__v3_8_0__stw.py: 8 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
test/test_rdf4j__geosciml.py: 4 warnings
test/test_stardog__lindas.py: 4 warnings
test/test_store__v1_1_4.py: 8 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'bar'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 2 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'CONSTRUCT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 2 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings
test/test_fuseki2__v3_8_0__stw.py: 4 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'CONSTRUCT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'DESCRIBE' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 2 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings
test/test_fuseki2__v3_8_0__stw.py: 4 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'DESCRIBE' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 1 warning
test/test_allegrograph__v4_14_1__mmi.py: 1 warning
test/test_blazegraph__wikidata.py: 1 warning
test/test_fuseki2__v3_6_0__agrovoc.py: 1 warning
test/test_fuseki2__v3_8_0__stw.py: 1 warning
test/test_graphdbEnterprise__v8_9_0__rs.py: 1 warning
test/test_lov-fuseki_on_hold.py: 1 warning
test/test_rdf4j__geosciml.py: 1 warning
test/test_stardog__lindas.py: 1 warning
test/test_store__v1_1_4.py: 1 warning
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:663: UserWarning: keepalive support not available, so the execution of this method has no effect
    warnings.warn(
test/test_agrovoc-allegrograph_on_hold.py: 2 warnings test/test_allegrograph__v4_14_1__mmi.py: 4 warnings test/test_blazegraph__wikidata.py: 4 warnings test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings test/test_fuseki2__v3_8_0__stw.py: 2 warnings test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings test/test_lov-fuseki_on_hold.py: 2 warnings test/test_rdf4j__geosciml.py: 2 warnings test/test_stardog__lindas.py: 2 warnings test/test_store__v1_1_4.py: 3 warnings test/test_virtuoso__v7_20_3230__dbpedia.py: 4 warnings test/test_virtuoso__v8_03_3313__dbpedia.py: 2 warnings test/test_wrapper.py: 1 warning /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'SELECT' SPARQL query form warnings.warn( test/test_agrovoc-allegrograph_on_hold.py: 2 warnings test/test_allegrograph__v4_14_1__mmi.py: 4 warnings test/test_blazegraph__wikidata.py: 4 warnings test/test_cli.py: 1 warning test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings test/test_fuseki2__v3_8_0__stw.py: 2 warnings test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings test/test_lov-fuseki_on_hold.py: 2 warnings test/test_rdf4j__geosciml.py: 2 warnings test/test_stardog__lindas.py: 2 warnings test/test_store__v1_1_4.py: 3 warnings /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'SELECT' SPARQL query form warnings.warn( test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'rdf' in a 'DESCRIBE' SPARQL query form warnings.warn( 
test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'rdf+xml' in a 'SELECT' SPARQL query form warnings.warn( test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtle /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'turtle' in a 'SELECT' SPARQL query form warnings.warn( -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html =========================== short test summary info ============================ FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSONLD FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinN3 FAILED 
test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinN3 FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinCSV FAILED 
test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinTSV FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow 
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED 
test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED 
test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED 
test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV FAILED 
test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED 
test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED 
test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED 
test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg 
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_cli.py::SPARQLWrapperCLIParser_Test::testInvalidFormat - Ass... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF - urllib.error.U... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryTo4store - urllib.er... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAgrovoc_AllegroGraph FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAllegroGraph - url... 
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToBrazeGraph - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_6 - urll... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_8 - urll... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToGraphDBEnterprise FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToLovFuseki - urllib... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToRDF4J - urllib.err... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToStardog - urllib.e... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV7 - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV8 - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithEndpoint - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFile - urllib.er... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileCSV - urllib... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileN3 - urllib.... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML - url... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTSV - urllib... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtle - url... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtleQuiet FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileXML - urllib... 
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testKeepAlive - u...
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testKeepAlive - urll...
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryBadFormed_1
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testKeepAlive - urll...
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV - ur...
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON - u...
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testKeepAlive - urllib...
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryBadFormed - u...
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinN3
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_wrapper.py::QueryResult_Test::testConvert - AssertionError: ...
= 858 failed, 38 passed, 549 skipped, 80 xfailed, 381 warnings in 760.92s (0:12:40) =
E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build; python3.13 -m pytest test
dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.13 returned exit code 13
make[1]: Leaving directory '/build/reproducible-path/sparql-wrapper-python-2.0.0'
create-stamp debian/debhelper-build-stamp
 dh_testroot -O--buildsystem=pybuild
 dh_prep -O--buildsystem=pybuild
 dh_auto_install --destdir=debian/python3-sparqlwrapper/ -O--buildsystem=pybuild
I: pybuild pybuild:308: rm -fr /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test
I: pybuild base:311: /usr/bin/python3 setup.py install --root /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper
/usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated.
!!
********************************************************************************
Please consider removing the following classifiers in favor of a SPDX license expression:
License :: OSI Approved :: W3C License
See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
******************************************************************************** !! self._finalize_license_expression() running install /usr/lib/python3/dist-packages/setuptools/_distutils/cmd.py:90: SetuptoolsDeprecationWarning: setup.py install is deprecated. !! ******************************************************************************** Please avoid running ``setup.py`` directly. Instead, use pypa/build, pypa/installer or other standards-based tools. See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details. ******************************************************************************** !! self.initialize_options() running build running build_py running install_lib creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/v/cache/lastfailed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/v/cache/nodeids -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/CACHEDIR.TAG -> 
/build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/.gitignore -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/README.md -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/main.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/sparql_dataframe.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/SPARQLExceptions.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ copying 
/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/KeyCaseInsensitiveDict.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/Wrapper.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/SmartWrapper.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/__init__.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/SPARQLExceptions.py -> 
/build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/KeyCaseInsensitiveDict.py to KeyCaseInsensitiveDict.cpython-313.pyc byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/SPARQLExceptions.py to SPARQLExceptions.cpython-313.pyc byte-compiling 
/build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/SmartWrapper.py to SmartWrapper.cpython-313.pyc byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/Wrapper.py to Wrapper.cpython-313.pyc byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__init__.py to __init__.cpython-313.pyc byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/main.py to main.cpython-313.pyc byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/sparql_dataframe.py to sparql_dataframe.cpython-313.pyc running install_egg_info running egg_info creating SPARQLWrapper.egg-info writing SPARQLWrapper.egg-info/PKG-INFO writing dependency_links to SPARQLWrapper.egg-info/dependency_links.txt writing entry points to SPARQLWrapper.egg-info/entry_points.txt writing requirements to SPARQLWrapper.egg-info/requires.txt writing top-level names to SPARQLWrapper.egg-info/top_level.txt writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt' reading manifest file 'SPARQLWrapper.egg-info/SOURCES.txt' reading manifest template 'MANIFEST.in' warning: no previously-included files found matching 'Makefile' warning: no directories found matching 'docs/build/html' adding license file 'LICENSE.txt' adding license file 'AUTHORS.md' writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt' Copying SPARQLWrapper.egg-info to /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper-2.0.0.egg-info Skipping SOURCES.txt running install_scripts Installing rqw script to 
/build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/bin dh_installdocs -O--buildsystem=pybuild dh_installchangelogs -O--buildsystem=pybuild dh_installexamples -O--buildsystem=pybuild dh_python3 -O--buildsystem=pybuild I: dh_python3 tools:114: replacing shebang in debian/python3-sparqlwrapper/usr/bin/rqw dh_installsystemduser -O--buildsystem=pybuild dh_perl -O--buildsystem=pybuild dh_link -O--buildsystem=pybuild dh_strip_nondeterminism -O--buildsystem=pybuild dh_compress -O--buildsystem=pybuild dh_fixperms -O--buildsystem=pybuild dh_missing -O--buildsystem=pybuild dh_installdeb -O--buildsystem=pybuild dh_gencontrol -O--buildsystem=pybuild dh_md5sums -O--buildsystem=pybuild dh_builddeb -O--buildsystem=pybuild dpkg-deb: building package 'python3-sparqlwrapper' in '../python3-sparqlwrapper_2.0.0-2_all.deb'. dpkg-genbuildinfo --build=binary -O../sparql-wrapper-python_2.0.0-2_arm64.buildinfo dpkg-genchanges --build=binary -O../sparql-wrapper-python_2.0.0-2_arm64.changes dpkg-genchanges: info: binary-only upload (no source code included) dpkg-source --after-build . 
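Aside: the `dh_python3 tools:114: replacing shebang in debian/python3-sparqlwrapper/usr/bin/rqw` line above is dh_python3 pinning the installed script to the packaged interpreter instead of whatever `#!/usr/bin/env python3` would resolve to at runtime. A minimal sketch of that kind of rewrite; the helper name and exact matching rule are hypothetical, not dh-python's actual code:

```python
import re

def replace_shebang(script_text: str, interpreter: str = "/usr/bin/python3") -> str:
    """Rewrite an '#!/usr/bin/env python3'-style shebang to a fixed interpreter.

    Hypothetical illustration of what dh_python3 does to scripts it installs;
    text without a python shebang is returned unchanged.
    """
    lines = script_text.splitlines(keepends=True)
    if lines and re.match(r"#!.*\bpython3?\b", lines[0]):
        lines[0] = f"#!{interpreter}\n"
    return "".join(lines)

rewritten = replace_shebang("#!/usr/bin/env python3\nprint('hi')\n")
```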
dpkg-buildpackage: info: binary-only upload (no source included) dpkg-genchanges: info: not including original source code in upload I: copying local configuration I: unmounting dev/ptmx filesystem I: unmounting dev/pts filesystem I: unmounting dev/shm filesystem I: unmounting proc filesystem I: unmounting sys filesystem I: cleaning the build env I: removing directory /srv/workspace/pbuilder/4132079 and its subdirectories I: Current time: Fri Oct 31 10:01:44 -12 2025 I: pbuilder-time-stamp: 1761948104 Fri Oct 31 22:01:44 UTC 2025 I: Signing ./b1/sparql-wrapper-python_2.0.0-2_arm64.buildinfo as sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc Fri Oct 31 22:01:44 UTC 2025 I: Signed ./b1/sparql-wrapper-python_2.0.0-2_arm64.buildinfo as ./b1/sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc Fri Oct 31 22:01:44 UTC 2025 - build #1 for sparql-wrapper-python/forky/arm64 on codethink04-arm64 done. Starting cleanup. All cleanup done. Fri Oct 31 22:01:44 UTC 2025 - reproducible_build.sh stopped running as /tmp/jenkins-script-mEuEB89r, removing. 
/srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy: total 16 drwxrwxr-x 2 jenkins jenkins 4096 Oct 31 22:01 b1 drwxrwxr-x 2 jenkins jenkins 4096 Oct 31 21:48 b2 -rw------- 1 jenkins jenkins 3417 Oct 31 21:48 rbuildlog.r4FpzJa -rw-rw-r-- 1 jenkins jenkins 2214 Jun 26 2024 sparql-wrapper-python_2.0.0-2.dsc /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b1: total 7756 -rw-r--r-- 1 jenkins jenkins 7862197 Oct 31 22:01 build.log -rw-r--r-- 1 jenkins jenkins 39232 Oct 31 22:01 python3-sparqlwrapper_2.0.0-2_all.deb -rw-r--r-- 1 jenkins jenkins 5692 Oct 31 22:01 sparql-wrapper-python_2.0.0-2.debian.tar.xz -rw-r--r-- 1 jenkins jenkins 2214 Oct 31 22:01 sparql-wrapper-python_2.0.0-2.dsc -rw-r--r-- 1 jenkins jenkins 5639 Oct 31 22:01 sparql-wrapper-python_2.0.0-2_arm64.buildinfo -rw-rw-r-- 1 jenkins jenkins 6521 Oct 31 22:01 sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc -rw-r--r-- 1 jenkins jenkins 1127 Oct 31 22:01 sparql-wrapper-python_2.0.0-2_arm64.changes -rw-r--r-- 1 jenkins jenkins 1315 Oct 31 22:01 sparql-wrapper-python_2.0.0-2_source.changes /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b2: total 0 Fri Oct 31 22:01:46 UTC 2025 I: Deleting $TMPDIR on codethink04-arm64.debian.net. I: pbuilder: network access will be disabled during build I: Current time: Fri Oct 31 09:48:24 -12 2025 I: pbuilder-time-stamp: 1761947304 I: Building the build Environment I: extracting base tarball [/var/cache/pbuilder/forky-reproducible-base.tgz] I: copying local configuration W: --override-config is not set; not updating apt.conf Read the manpage for details. 
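Aside: the builds deliberately vary the environment, and the timezone is one of the varied inputs — this run uses `TZ='/usr/share/zoneinfo/Etc/GMT+12'`, which is why `pbuilder-time-stamp: 1761947304` prints as `Fri Oct 31 09:48:24 -12 2025` while the jenkins wall clock reads 21:48:24 UTC. A sketch of that rendering, assuming a POSIX system where `time.tzset()` is available:

```python
import os
import time

# pbuilder-time-stamp from this build (seconds since the epoch)
STAMP = 1761947304

# POSIX zone names invert the sign: Etc/GMT+12 means UTC-12
os.environ["TZ"] = "Etc/GMT+12"
time.tzset()
local = time.strftime("%a %b %d %H:%M:%S %Y", time.localtime(STAMP))

os.environ["TZ"] = "UTC"
time.tzset()
utc = time.strftime("%a %b %d %H:%M:%S %Y", time.localtime(STAMP))
```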
I: mounting /proc filesystem I: mounting /sys filesystem I: creating /{dev,run}/shm I: mounting /dev/pts filesystem I: redirecting /dev/ptmx to /dev/pts/ptmx I: policy-rc.d already exists I: Copying source file I: copying [sparql-wrapper-python_2.0.0-2.dsc] I: copying [./sparql-wrapper-python_2.0.0.orig.tar.gz] I: copying [./sparql-wrapper-python_2.0.0-2.debian.tar.xz] I: Extracting source dpkg-source: warning: cannot verify inline signature for ./sparql-wrapper-python_2.0.0-2.dsc: no acceptable signature found dpkg-source: info: extracting sparql-wrapper-python in sparql-wrapper-python-2.0.0 dpkg-source: info: unpacking sparql-wrapper-python_2.0.0.orig.tar.gz dpkg-source: info: unpacking sparql-wrapper-python_2.0.0-2.debian.tar.xz I: Not using root during the build. I: Installing the build-deps I: user script /srv/workspace/pbuilder/4132079/tmp/hooks/D02_print_environment starting I: set BUILDDIR='/build/reproducible-path' BUILDUSERGECOS='first user,first room,first work-phone,first home-phone,first other' BUILDUSERNAME='pbuilder1' BUILD_ARCH='arm64' DEBIAN_FRONTEND='noninteractive' DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=12 ' DISTRIBUTION='forky' HOME='/root' HOST_ARCH='arm64' IFS=' ' INVOCATION_ID='7c7dec4a095148debb57cd30c0788beb' LANG='C' LANGUAGE='en_US:en' LC_ALL='C' MAIL='/var/mail/root' OPTIND='1' PATH='/usr/sbin:/usr/bin:/sbin:/bin:/usr/games' PBCURRENTCOMMANDLINEOPERATION='build' PBUILDER_OPERATION='build' PBUILDER_PKGDATADIR='/usr/share/pbuilder' PBUILDER_PKGLIBDIR='/usr/lib/pbuilder' PBUILDER_SYSCONFDIR='/etc' PPID='4132079' PS1='# ' PS2='> ' PS4='+ ' PWD='/' SHELL='/bin/bash' SHLVL='2' SUDO_COMMAND='/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/pbuilderrc_fN2b --distribution forky --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/forky-reproducible-base.tgz --buildresult 
/srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b1 --logfile b1/build.log sparql-wrapper-python_2.0.0-2.dsc' SUDO_GID='109' SUDO_HOME='/var/lib/jenkins' SUDO_UID='104' SUDO_USER='jenkins' TERM='unknown' TZ='/usr/share/zoneinfo/Etc/GMT+12' USER='root' _='/usr/bin/systemd-run' http_proxy='http://192.168.101.4:3128' I: uname -a Linux codethink04-arm64 6.12.48+deb13-cloud-arm64 #1 SMP Debian 6.12.48-1 (2025-09-20) aarch64 GNU/Linux I: ls -l /bin lrwxrwxrwx 1 root root 7 Aug 10 12:30 /bin -> usr/bin I: user script /srv/workspace/pbuilder/4132079/tmp/hooks/D02_print_environment finished -> Attempting to satisfy build-dependencies -> Creating pbuilder-satisfydepends-dummy package Package: pbuilder-satisfydepends-dummy Version: 0.invalid.0 Architecture: arm64 Maintainer: Debian Pbuilder Team Description: Dummy package to satisfy dependencies with aptitude - created by pbuilder This package was created automatically by pbuilder to satisfy the build-dependencies of the package being currently built. Depends: debhelper-compat (= 13), dh-sequence-python3, python3-all, python3-pytest, python3-setuptools, python3-rdflib dpkg-deb: building package 'pbuilder-satisfydepends-dummy' in '/tmp/satisfydepends-aptitude/pbuilder-satisfydepends-dummy.deb'. Selecting previously unselected package pbuilder-satisfydepends-dummy. (Reading database ... 19971 files and directories currently installed.) Preparing to unpack .../pbuilder-satisfydepends-dummy.deb ... Unpacking pbuilder-satisfydepends-dummy (0.invalid.0) ... dpkg: pbuilder-satisfydepends-dummy: dependency problems, but configuring anyway as you requested: pbuilder-satisfydepends-dummy depends on debhelper-compat (= 13); however: Package debhelper-compat is not installed. pbuilder-satisfydepends-dummy depends on dh-sequence-python3; however: Package dh-sequence-python3 is not installed. pbuilder-satisfydepends-dummy depends on python3-all; however: Package python3-all is not installed. 
pbuilder-satisfydepends-dummy depends on python3-pytest; however: Package python3-pytest is not installed. pbuilder-satisfydepends-dummy depends on python3-setuptools; however: Package python3-setuptools is not installed. pbuilder-satisfydepends-dummy depends on python3-rdflib; however: Package python3-rdflib is not installed. Setting up pbuilder-satisfydepends-dummy (0.invalid.0) ... Reading package lists... Building dependency tree... Reading state information... Initializing package states... Writing extended state information... Building tag database... pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0) pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0) The following NEW packages will be installed: autoconf{a} automake{a} autopoint{a} autotools-dev{a} bsdextrautils{a} debhelper{a} dh-autoreconf{a} dh-python{a} dh-strip-nondeterminism{a} dwz{a} file{a} gettext{a} gettext-base{a} groff-base{a} intltool-debian{a} libarchive-zip-perl{a} libdebhelper-perl{a} libelf1t64{a} libexpat1{a} libffi8{a} libfile-stripnondeterminism-perl{a} libmagic-mgc{a} libmagic1t64{a} libpipeline1{a} libpython3-stdlib{a} libpython3.13-minimal{a} libpython3.13-stdlib{a} libreadline8t64{a} libtool{a} libuchardet0{a} libunistring5{a} libxml2-16{a} m4{a} man-db{a} media-types{a} netbase{a} po-debconf{a} python3{a} python3-all{a} python3-autocommand{a} python3-inflect{a} python3-iniconfig{a} python3-jaraco.context{a} python3-jaraco.functools{a} python3-jaraco.text{a} python3-minimal{a} python3-more-itertools{a} python3-packaging{a} python3-pkg-resources{a} python3-pluggy{a} python3-pygments{a} python3-pyparsing{a} python3-pytest{a} python3-rdflib{a} python3-setuptools{a} python3-typeguard{a} python3-typing-extensions{a} python3-zipp{a} python3.13{a} python3.13-minimal{a} readline-common{a} sensible-utils{a} tzdata{a} The following packages are RECOMMENDED but will NOT be installed: ca-certificates curl 
libarchive-cpio-perl libltdl-dev libmail-sendmail-perl lynx python3-html5rdf python3-lxml python3-networkx python3-orjson wget 0 packages upgraded, 63 newly installed, 0 to remove and 0 not upgraded. Need to get 20.5 MB of archives. After unpacking 88.4 MB will be used. Writing extended state information... Get: 1 http://deb.debian.org/debian forky/main arm64 libexpat1 arm64 2.7.3-1 [96.5 kB] Get: 2 http://deb.debian.org/debian forky/main arm64 libpython3.13-minimal arm64 3.13.9-1 [858 kB] Get: 3 http://deb.debian.org/debian forky/main arm64 python3.13-minimal arm64 3.13.9-1 [2061 kB] Get: 4 http://deb.debian.org/debian forky/main arm64 python3-minimal arm64 3.13.7-1 [27.2 kB] Get: 5 http://deb.debian.org/debian forky/main arm64 media-types all 14.0.0 [30.8 kB] Get: 6 http://deb.debian.org/debian forky/main arm64 netbase all 6.5 [12.4 kB] Get: 7 http://deb.debian.org/debian forky/main arm64 tzdata all 2025b-5 [260 kB] Get: 8 http://deb.debian.org/debian forky/main arm64 libffi8 arm64 3.5.2-2 [21.5 kB] Get: 9 http://deb.debian.org/debian forky/main arm64 readline-common all 8.3-3 [74.8 kB] Get: 10 http://deb.debian.org/debian forky/main arm64 libreadline8t64 arm64 8.3-3 [169 kB] Get: 11 http://deb.debian.org/debian forky/main arm64 libpython3.13-stdlib arm64 3.13.9-1 [1900 kB] Get: 12 http://deb.debian.org/debian forky/main arm64 python3.13 arm64 3.13.9-1 [764 kB] Get: 13 http://deb.debian.org/debian forky/main arm64 libpython3-stdlib arm64 3.13.7-1 [10.2 kB] Get: 14 http://deb.debian.org/debian forky/main arm64 python3 arm64 3.13.7-1 [28.3 kB] Get: 15 http://deb.debian.org/debian forky/main arm64 sensible-utils all 0.0.26 [27.0 kB] Get: 16 http://deb.debian.org/debian forky/main arm64 libmagic-mgc arm64 1:5.46-5 [338 kB] Get: 17 http://deb.debian.org/debian forky/main arm64 libmagic1t64 arm64 1:5.46-5 [103 kB] Get: 18 http://deb.debian.org/debian forky/main arm64 file arm64 1:5.46-5 [43.7 kB] Get: 19 http://deb.debian.org/debian forky/main arm64 gettext-base arm64 
0.23.1-2+b1 [241 kB] Get: 20 http://deb.debian.org/debian forky/main arm64 libuchardet0 arm64 0.0.8-2 [69.0 kB] Get: 21 http://deb.debian.org/debian forky/main arm64 groff-base arm64 1.23.0-9 [1130 kB] Get: 22 http://deb.debian.org/debian forky/main arm64 bsdextrautils arm64 2.41.2-4 [97.3 kB] Get: 23 http://deb.debian.org/debian forky/main arm64 libpipeline1 arm64 1.5.8-1 [40.2 kB] Get: 24 http://deb.debian.org/debian forky/main arm64 man-db arm64 2.13.1-1 [1453 kB] Get: 25 http://deb.debian.org/debian forky/main arm64 m4 arm64 1.4.20-2 [315 kB] Get: 26 http://deb.debian.org/debian forky/main arm64 autoconf all 2.72-3.1 [494 kB] Get: 27 http://deb.debian.org/debian forky/main arm64 autotools-dev all 20240727.1 [60.2 kB] Get: 28 http://deb.debian.org/debian forky/main arm64 automake all 1:1.18.1-2 [877 kB] Get: 29 http://deb.debian.org/debian forky/main arm64 autopoint all 0.23.1-2 [770 kB] Get: 30 http://deb.debian.org/debian forky/main arm64 libdebhelper-perl all 13.28 [92.4 kB] Get: 31 http://deb.debian.org/debian forky/main arm64 libtool all 2.5.4-7 [540 kB] Get: 32 http://deb.debian.org/debian forky/main arm64 dh-autoreconf all 21 [12.2 kB] Get: 33 http://deb.debian.org/debian forky/main arm64 libarchive-zip-perl all 1.68-1 [104 kB] Get: 34 http://deb.debian.org/debian forky/main arm64 libfile-stripnondeterminism-perl all 1.15.0-1 [19.9 kB] Get: 35 http://deb.debian.org/debian forky/main arm64 dh-strip-nondeterminism all 1.15.0-1 [8812 B] Get: 36 http://deb.debian.org/debian forky/main arm64 libelf1t64 arm64 0.193-3 [189 kB] Get: 37 http://deb.debian.org/debian forky/main arm64 dwz arm64 0.16-2 [100 kB] Get: 38 http://deb.debian.org/debian forky/main arm64 libunistring5 arm64 1.3-2 [453 kB] Get: 39 http://deb.debian.org/debian forky/main arm64 libxml2-16 arm64 2.14.6+dfsg-0.1 [601 kB] Get: 40 http://deb.debian.org/debian forky/main arm64 gettext arm64 0.23.1-2+b1 [1612 kB] Get: 41 http://deb.debian.org/debian forky/main arm64 intltool-debian all 
0.35.0+20060710.6 [22.9 kB] Get: 42 http://deb.debian.org/debian forky/main arm64 po-debconf all 1.0.21+nmu1 [248 kB] Get: 43 http://deb.debian.org/debian forky/main arm64 debhelper all 13.28 [941 kB] Get: 44 http://deb.debian.org/debian forky/main arm64 dh-python all 6.20250414 [116 kB] Get: 45 http://deb.debian.org/debian forky/main arm64 python3-all arm64 3.13.7-1 [1044 B] Get: 46 http://deb.debian.org/debian forky/main arm64 python3-autocommand all 2.2.2-3 [13.6 kB] Get: 47 http://deb.debian.org/debian forky/main arm64 python3-more-itertools all 10.8.0-1 [71.7 kB] Get: 48 http://deb.debian.org/debian forky/main arm64 python3-typing-extensions all 4.15.0-1 [92.4 kB] Get: 49 http://deb.debian.org/debian forky/main arm64 python3-typeguard all 4.4.4-1 [37.1 kB] Get: 50 http://deb.debian.org/debian forky/main arm64 python3-inflect all 7.5.0-1 [33.0 kB] Get: 51 http://deb.debian.org/debian forky/main arm64 python3-iniconfig all 2.1.0-1 [7432 B] Get: 52 http://deb.debian.org/debian forky/main arm64 python3-jaraco.functools all 4.1.0-1 [12.0 kB] Get: 53 http://deb.debian.org/debian forky/main arm64 python3-pkg-resources all 78.1.1-0.1 [224 kB] Get: 54 http://deb.debian.org/debian forky/main arm64 python3-jaraco.text all 4.0.0-1 [11.4 kB] Get: 55 http://deb.debian.org/debian forky/main arm64 python3-zipp all 3.23.0-1 [11.0 kB] Get: 56 http://deb.debian.org/debian forky/main arm64 python3-setuptools all 78.1.1-0.1 [738 kB] Get: 57 http://deb.debian.org/debian forky/main arm64 python3-jaraco.context all 6.0.1-1 [8276 B] Get: 58 http://deb.debian.org/debian forky/main arm64 python3-packaging all 25.0-1 [56.6 kB] Get: 59 http://deb.debian.org/debian forky/main arm64 python3-pluggy all 1.6.0-1 [27.1 kB] Get: 60 http://deb.debian.org/debian forky/main arm64 python3-pygments all 2.18.0+dfsg-2 [836 kB] Get: 61 http://deb.debian.org/debian forky/main arm64 python3-pyparsing all 3.1.3-1 [148 kB] Get: 62 http://deb.debian.org/debian forky/main arm64 python3-pytest all 8.4.2-1 [266 
kB] Get: 63 http://deb.debian.org/debian forky/main arm64 python3-rdflib all 7.1.1-3 [472 kB] Fetched 20.5 MB in 0s (87.3 MB/s) Preconfiguring packages ... Selecting previously unselected package libexpat1:arm64. (Reading database ... 19971 files and directories currently installed.) Preparing to unpack .../libexpat1_2.7.3-1_arm64.deb ... Unpacking libexpat1:arm64 (2.7.3-1) ... Selecting previously unselected package libpython3.13-minimal:arm64. Preparing to unpack .../libpython3.13-minimal_3.13.9-1_arm64.deb ... Unpacking libpython3.13-minimal:arm64 (3.13.9-1) ... Selecting previously unselected package python3.13-minimal. Preparing to unpack .../python3.13-minimal_3.13.9-1_arm64.deb ... Unpacking python3.13-minimal (3.13.9-1) ... Setting up libpython3.13-minimal:arm64 (3.13.9-1) ... Setting up libexpat1:arm64 (2.7.3-1) ... Setting up python3.13-minimal (3.13.9-1) ... Selecting previously unselected package python3-minimal. (Reading database ... 20305 files and directories currently installed.) Preparing to unpack .../0-python3-minimal_3.13.7-1_arm64.deb ... Unpacking python3-minimal (3.13.7-1) ... Selecting previously unselected package media-types. Preparing to unpack .../1-media-types_14.0.0_all.deb ... Unpacking media-types (14.0.0) ... Selecting previously unselected package netbase. Preparing to unpack .../2-netbase_6.5_all.deb ... Unpacking netbase (6.5) ... Selecting previously unselected package tzdata. Preparing to unpack .../3-tzdata_2025b-5_all.deb ... Unpacking tzdata (2025b-5) ... Selecting previously unselected package libffi8:arm64. Preparing to unpack .../4-libffi8_3.5.2-2_arm64.deb ... Unpacking libffi8:arm64 (3.5.2-2) ... Selecting previously unselected package readline-common. Preparing to unpack .../5-readline-common_8.3-3_all.deb ... Unpacking readline-common (8.3-3) ... Selecting previously unselected package libreadline8t64:arm64. Preparing to unpack .../6-libreadline8t64_8.3-3_arm64.deb ... Adding 'diversion of /lib/aarch64-linux-gnu/libhistory.so.8 to /lib/aarch64-linux-gnu/libhistory.so.8.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/aarch64-linux-gnu/libhistory.so.8.2 to /lib/aarch64-linux-gnu/libhistory.so.8.2.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/aarch64-linux-gnu/libreadline.so.8 to /lib/aarch64-linux-gnu/libreadline.so.8.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/aarch64-linux-gnu/libreadline.so.8.2 to /lib/aarch64-linux-gnu/libreadline.so.8.2.usr-is-merged by libreadline8t64' Unpacking libreadline8t64:arm64 (8.3-3) ... Selecting previously unselected package libpython3.13-stdlib:arm64. Preparing to unpack .../7-libpython3.13-stdlib_3.13.9-1_arm64.deb ... Unpacking libpython3.13-stdlib:arm64 (3.13.9-1) ... Selecting previously unselected package python3.13. Preparing to unpack .../8-python3.13_3.13.9-1_arm64.deb ... Unpacking python3.13 (3.13.9-1) ... 
Selecting previously unselected package libpython3-stdlib:arm64. Preparing to unpack .../9-libpython3-stdlib_3.13.7-1_arm64.deb ... Unpacking libpython3-stdlib:arm64 (3.13.7-1) ... Setting up python3-minimal (3.13.7-1) ... Selecting previously unselected package python3. (Reading database ... 21320 files and directories currently installed.) Preparing to unpack .../00-python3_3.13.7-1_arm64.deb ... Unpacking python3 (3.13.7-1) ... Selecting previously unselected package sensible-utils. Preparing to unpack .../01-sensible-utils_0.0.26_all.deb ... Unpacking sensible-utils (0.0.26) ... Selecting previously unselected package libmagic-mgc. Preparing to unpack .../02-libmagic-mgc_1%3a5.46-5_arm64.deb ... Unpacking libmagic-mgc (1:5.46-5) ... Selecting previously unselected package libmagic1t64:arm64. Preparing to unpack .../03-libmagic1t64_1%3a5.46-5_arm64.deb ... Unpacking libmagic1t64:arm64 (1:5.46-5) ... Selecting previously unselected package file. Preparing to unpack .../04-file_1%3a5.46-5_arm64.deb ... Unpacking file (1:5.46-5) ... Selecting previously unselected package gettext-base. Preparing to unpack .../05-gettext-base_0.23.1-2+b1_arm64.deb ... Unpacking gettext-base (0.23.1-2+b1) ... Selecting previously unselected package libuchardet0:arm64. Preparing to unpack .../06-libuchardet0_0.0.8-2_arm64.deb ... Unpacking libuchardet0:arm64 (0.0.8-2) ... Selecting previously unselected package groff-base. 
Preparing to unpack .../07-groff-base_1.23.0-9_arm64.deb ... Unpacking groff-base (1.23.0-9) ... Selecting previously unselected package bsdextrautils. Preparing to unpack .../08-bsdextrautils_2.41.2-4_arm64.deb ... Unpacking bsdextrautils (2.41.2-4) ... Selecting previously unselected package libpipeline1:arm64. Preparing to unpack .../09-libpipeline1_1.5.8-1_arm64.deb ... Unpacking libpipeline1:arm64 (1.5.8-1) ... Selecting previously unselected package man-db. Preparing to unpack .../10-man-db_2.13.1-1_arm64.deb ... Unpacking man-db (2.13.1-1) ... Selecting previously unselected package m4. Preparing to unpack .../11-m4_1.4.20-2_arm64.deb ... Unpacking m4 (1.4.20-2) ... Selecting previously unselected package autoconf. Preparing to unpack .../12-autoconf_2.72-3.1_all.deb ... Unpacking autoconf (2.72-3.1) ... Selecting previously unselected package autotools-dev. Preparing to unpack .../13-autotools-dev_20240727.1_all.deb ... Unpacking autotools-dev (20240727.1) ... Selecting previously unselected package automake. Preparing to unpack .../14-automake_1%3a1.18.1-2_all.deb ... Unpacking automake (1:1.18.1-2) ... Selecting previously unselected package autopoint. Preparing to unpack .../15-autopoint_0.23.1-2_all.deb ... Unpacking autopoint (0.23.1-2) ... Selecting previously unselected package libdebhelper-perl. Preparing to unpack .../16-libdebhelper-perl_13.28_all.deb ... Unpacking libdebhelper-perl (13.28) ... Selecting previously unselected package libtool. Preparing to unpack .../17-libtool_2.5.4-7_all.deb ... Unpacking libtool (2.5.4-7) ... Selecting previously unselected package dh-autoreconf. Preparing to unpack .../18-dh-autoreconf_21_all.deb ... Unpacking dh-autoreconf (21) ... Selecting previously unselected package libarchive-zip-perl. Preparing to unpack .../19-libarchive-zip-perl_1.68-1_all.deb ... Unpacking libarchive-zip-perl (1.68-1) ... Selecting previously unselected package libfile-stripnondeterminism-perl. 
Preparing to unpack .../20-libfile-stripnondeterminism-perl_1.15.0-1_all.deb ... Unpacking libfile-stripnondeterminism-perl (1.15.0-1) ... Selecting previously unselected package dh-strip-nondeterminism. Preparing to unpack .../21-dh-strip-nondeterminism_1.15.0-1_all.deb ... Unpacking dh-strip-nondeterminism (1.15.0-1) ... Selecting previously unselected package libelf1t64:arm64. Preparing to unpack .../22-libelf1t64_0.193-3_arm64.deb ... Unpacking libelf1t64:arm64 (0.193-3) ... Selecting previously unselected package dwz. Preparing to unpack .../23-dwz_0.16-2_arm64.deb ... Unpacking dwz (0.16-2) ... Selecting previously unselected package libunistring5:arm64. Preparing to unpack .../24-libunistring5_1.3-2_arm64.deb ... Unpacking libunistring5:arm64 (1.3-2) ... Selecting previously unselected package libxml2-16:arm64. Preparing to unpack .../25-libxml2-16_2.14.6+dfsg-0.1_arm64.deb ... Unpacking libxml2-16:arm64 (2.14.6+dfsg-0.1) ... Selecting previously unselected package gettext. Preparing to unpack .../26-gettext_0.23.1-2+b1_arm64.deb ... Unpacking gettext (0.23.1-2+b1) ... Selecting previously unselected package intltool-debian. Preparing to unpack .../27-intltool-debian_0.35.0+20060710.6_all.deb ... Unpacking intltool-debian (0.35.0+20060710.6) ... Selecting previously unselected package po-debconf. Preparing to unpack .../28-po-debconf_1.0.21+nmu1_all.deb ... Unpacking po-debconf (1.0.21+nmu1) ... Selecting previously unselected package debhelper. Preparing to unpack .../29-debhelper_13.28_all.deb ... Unpacking debhelper (13.28) ... Selecting previously unselected package dh-python. Preparing to unpack .../30-dh-python_6.20250414_all.deb ... Unpacking dh-python (6.20250414) ... Selecting previously unselected package python3-all. Preparing to unpack .../31-python3-all_3.13.7-1_arm64.deb ... Unpacking python3-all (3.13.7-1) ... Selecting previously unselected package python3-autocommand. Preparing to unpack .../32-python3-autocommand_2.2.2-3_all.deb ... 
Unpacking python3-autocommand (2.2.2-3) ... Selecting previously unselected package python3-more-itertools. Preparing to unpack .../33-python3-more-itertools_10.8.0-1_all.deb ... Unpacking python3-more-itertools (10.8.0-1) ... Selecting previously unselected package python3-typing-extensions. Preparing to unpack .../34-python3-typing-extensions_4.15.0-1_all.deb ... Unpacking python3-typing-extensions (4.15.0-1) ... Selecting previously unselected package python3-typeguard. Preparing to unpack .../35-python3-typeguard_4.4.4-1_all.deb ... Unpacking python3-typeguard (4.4.4-1) ... Selecting previously unselected package python3-inflect. Preparing to unpack .../36-python3-inflect_7.5.0-1_all.deb ... Unpacking python3-inflect (7.5.0-1) ... Selecting previously unselected package python3-iniconfig. Preparing to unpack .../37-python3-iniconfig_2.1.0-1_all.deb ... Unpacking python3-iniconfig (2.1.0-1) ... Selecting previously unselected package python3-jaraco.functools. Preparing to unpack .../38-python3-jaraco.functools_4.1.0-1_all.deb ... Unpacking python3-jaraco.functools (4.1.0-1) ... Selecting previously unselected package python3-pkg-resources. Preparing to unpack .../39-python3-pkg-resources_78.1.1-0.1_all.deb ... Unpacking python3-pkg-resources (78.1.1-0.1) ... Selecting previously unselected package python3-jaraco.text. Preparing to unpack .../40-python3-jaraco.text_4.0.0-1_all.deb ... Unpacking python3-jaraco.text (4.0.0-1) ... Selecting previously unselected package python3-zipp. Preparing to unpack .../41-python3-zipp_3.23.0-1_all.deb ... Unpacking python3-zipp (3.23.0-1) ... Selecting previously unselected package python3-setuptools. Preparing to unpack .../42-python3-setuptools_78.1.1-0.1_all.deb ... Unpacking python3-setuptools (78.1.1-0.1) ... Selecting previously unselected package python3-jaraco.context. Preparing to unpack .../43-python3-jaraco.context_6.0.1-1_all.deb ... Unpacking python3-jaraco.context (6.0.1-1) ... 
Selecting previously unselected package python3-packaging. Preparing to unpack .../44-python3-packaging_25.0-1_all.deb ... Unpacking python3-packaging (25.0-1) ... Selecting previously unselected package python3-pluggy. Preparing to unpack .../45-python3-pluggy_1.6.0-1_all.deb ... Unpacking python3-pluggy (1.6.0-1) ... Selecting previously unselected package python3-pygments. Preparing to unpack .../46-python3-pygments_2.18.0+dfsg-2_all.deb ... Unpacking python3-pygments (2.18.0+dfsg-2) ... Selecting previously unselected package python3-pyparsing. Preparing to unpack .../47-python3-pyparsing_3.1.3-1_all.deb ... Unpacking python3-pyparsing (3.1.3-1) ... Selecting previously unselected package python3-pytest. Preparing to unpack .../48-python3-pytest_8.4.2-1_all.deb ... Unpacking python3-pytest (8.4.2-1) ... Selecting previously unselected package python3-rdflib. Preparing to unpack .../49-python3-rdflib_7.1.1-3_all.deb ... Unpacking python3-rdflib (7.1.1-3) ... Setting up media-types (14.0.0) ... Setting up libpipeline1:arm64 (1.5.8-1) ... Setting up bsdextrautils (2.41.2-4) ... Setting up libmagic-mgc (1:5.46-5) ... Setting up libarchive-zip-perl (1.68-1) ... Setting up libxml2-16:arm64 (2.14.6+dfsg-0.1) ... Setting up libdebhelper-perl (13.28) ... Setting up libmagic1t64:arm64 (1:5.46-5) ... Setting up gettext-base (0.23.1-2+b1) ... Setting up m4 (1.4.20-2) ... Setting up file (1:5.46-5) ... Setting up libelf1t64:arm64 (0.193-3) ... Setting up tzdata (2025b-5) ... Current default time zone: 'Etc/UTC' Local time is now: Fri Oct 31 21:48:43 UTC 2025. Universal Time is now: Fri Oct 31 21:48:43 UTC 2025. Run 'dpkg-reconfigure tzdata' if you wish to change it. Setting up autotools-dev (20240727.1) ... Setting up libunistring5:arm64 (1.3-2) ... Setting up autopoint (0.23.1-2) ... Setting up autoconf (2.72-3.1) ... Setting up libffi8:arm64 (3.5.2-2) ... Setting up dwz (0.16-2) ... Setting up sensible-utils (0.0.26) ... Setting up libuchardet0:arm64 (0.0.8-2) ... 
Setting up netbase (6.5) ... Setting up readline-common (8.3-3) ... Setting up automake (1:1.18.1-2) ... update-alternatives: using /usr/bin/automake-1.18 to provide /usr/bin/automake (automake) in auto mode Setting up libfile-stripnondeterminism-perl (1.15.0-1) ... Setting up gettext (0.23.1-2+b1) ... Setting up libtool (2.5.4-7) ... Setting up intltool-debian (0.35.0+20060710.6) ... Setting up dh-autoreconf (21) ... Setting up libreadline8t64:arm64 (8.3-3) ... Setting up dh-strip-nondeterminism (1.15.0-1) ... Setting up groff-base (1.23.0-9) ... Setting up libpython3.13-stdlib:arm64 (3.13.9-1) ... Setting up libpython3-stdlib:arm64 (3.13.7-1) ... Setting up python3.13 (3.13.9-1) ... Setting up po-debconf (1.0.21+nmu1) ... Setting up python3 (3.13.7-1) ... Setting up python3-zipp (3.23.0-1) ... Setting up python3-autocommand (2.2.2-3) ... Setting up man-db (2.13.1-1) ... Not building database; man-db/auto-update is not 'true'. Setting up python3-pygments (2.18.0+dfsg-2) ... Setting up python3-packaging (25.0-1) ... Setting up python3-pyparsing (3.1.3-1) ... Setting up python3-typing-extensions (4.15.0-1) ... Setting up python3-pluggy (1.6.0-1) ... Setting up python3-rdflib (7.1.1-3) ... Setting up dh-python (6.20250414) ... Setting up python3-more-itertools (10.8.0-1) ... Setting up python3-iniconfig (2.1.0-1) ... Setting up python3-jaraco.functools (4.1.0-1) ... Setting up python3-jaraco.context (6.0.1-1) ... Setting up python3-pytest (8.4.2-1) ... Setting up python3-typeguard (4.4.4-1) ... Setting up python3-all (3.13.7-1) ... Setting up debhelper (13.28) ... Setting up python3-inflect (7.5.0-1) ... Setting up python3-jaraco.text (4.0.0-1) ... Setting up python3-pkg-resources (78.1.1-0.1) ... Setting up python3-setuptools (78.1.1-0.1) ... Processing triggers for libc-bin (2.41-12) ... Reading package lists... Building dependency tree... Reading state information... Reading extended state information... Initializing package states... 
Writing extended state information... Building tag database... -> Finished parsing the build-deps I: Building the package I: Running cd /build/reproducible-path/sparql-wrapper-python-2.0.0/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-genchanges -S > ../sparql-wrapper-python_2.0.0-2_source.changes dpkg-buildpackage: info: source package sparql-wrapper-python dpkg-buildpackage: info: source version 2.0.0-2 dpkg-buildpackage: info: source distribution unstable dpkg-buildpackage: info: source changed by Alexandre Detiste dpkg-source --before-build . dpkg-buildpackage: info: host architecture arm64 debian/rules clean dh clean --buildsystem=pybuild dh_auto_clean -O--buildsystem=pybuild I: pybuild base:311: python3.13 setup.py clean /usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated. !! ******************************************************************************** Please consider removing the following classifiers in favor of a SPDX license expression: License :: OSI Approved :: W3C License See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details. ******************************************************************************** !! 
  self._finalize_license_expression()
running clean
removing '/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build' (and everything under it)
'build/bdist.linux-aarch64' does not exist -- can't clean it
'build/scripts-3.13' does not exist -- can't clean it
   dh_autoreconf_clean -O--buildsystem=pybuild
   dh_clean -O--buildsystem=pybuild
 debian/rules binary
dh binary --buildsystem=pybuild
   dh_update_autotools_config -O--buildsystem=pybuild
   dh_autoreconf -O--buildsystem=pybuild
   dh_auto_configure -O--buildsystem=pybuild
I: pybuild base:311: python3.13 setup.py config
/usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated.
!!

        ********************************************************************************
        Please consider removing the following classifiers in favor of a SPDX license expression:

        License :: OSI Approved :: W3C License

        See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
        ********************************************************************************

!!
  self._finalize_license_expression()
running config
   dh_auto_build -O--buildsystem=pybuild
I: pybuild base:311: /usr/bin/python3 setup.py build
/usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated.
!!

        ********************************************************************************
        Please consider removing the following classifiers in favor of a SPDX license expression:

        License :: OSI Approved :: W3C License

        See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details.
        ********************************************************************************

!!
  self._finalize_license_expression()
running build
running build_py
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
copying SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
I: pybuild pybuild:334: cp -r test /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build
 debian/rules override_dh_auto_test
make[1]: Entering directory '/build/reproducible-path/sparql-wrapper-python-2.0.0'
# tests need a remote server
dh_auto_test || :
I: pybuild base:311: cd /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build; python3.13 -m pytest test
============================= test session starts ==============================
platform linux -- Python 3.13.9, pytest-8.4.2, pluggy-1.6.0
rootdir: /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build
configfile: pyproject.toml
plugins: typeguard-4.4.4
collected 1525 items

test/test_agrovoc-allegrograph_on_hold.py sFxxsFFsFFxsFFxxsFFFFxxsFFFFxx [ 1%]
sFFFFxxsFFFFFFFFssFFFxxFFxFFxxFFF [ 4%]
test/test_allegrograph__v4_14_1__mmi.py ssFFFFFFssFFFFssFFFFFFssFFFFFFss [ 6%]
FFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFFFssFFFFFF [ 10%]
FFFFFFFFFFFFFFFFFFFFFFF [ 12%]
test/test_blazegraph__wikidata.py ssFFFFFFssFFFFssFFFFFFssFFFFFFsFsFsFFF [ 14%]
sFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFFsFsFFFFFFFsFF [ 19%]
FFFsFFFFFFFsFFFFF [ 20%]
test/test_cli.py ..F...FFFFFFFFFFFFFFFFFFFFFF [ 22%]
test/test_fuseki2__v3_6_0__agrovoc.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFF [ 24%]
sFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFs [ 29%]
FFFFFFFFFFsFFsFFFFFFF [ 30%]
test/test_fuseki2__v3_8_0__stw.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFFsFsF [ 33%]
sFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFsFFFF [ 38%]
FFFFFFsFFsFFFFFFF [ 39%]
test/test_graphdbEnterprise__v8_9_0__rs.py ssssFFsFsssFsFssssFFsFsssFsFs [ 41%]
FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFsFFsFsF [ 45%]
ssFFsFsFsFsFsFssFFsFsFsFsF [ 47%]
test/test_lov-fuseki_on_hold.py FFFFFFFFFFFFFFssssssssssssssFFFFFFFFFFFF [ 50%]
FFFFssssssssssssssssFFFFFFFFFFFFFFFFssssssssssssssssFsFFssFFFFFFFFFFFFFF [ 54%]
Fssssssssssssss [ 55%]
test/test_rdf4j__geosciml.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 58%]
FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFssFFsFsFssFFsFsFsFsFs [ 63%]
FssFFsFsFsFsF [ 64%]
test/test_stardog__lindas.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 67%]
FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFssFFFsFsFssFFsFsFsFsFs [ 71%]
FssFFsFsFsFsF [ 72%]
test/test_store__v1_1_4.py FFFsFFsFsFxFxFxxxxxxxxxxxxxxsFsssFsFsFsFxFxFx [ 75%]
xssxxxxxxxxxxxxsFsssFsssFssxFxFxxssxxxxxxxxxxxxFFFFssFFFFsFFsFsFxFxFxxxx [ 80%]
xxxxxxxxxx [ 81%]
test/test_virtuoso__v7_20_3230__dbpedia.py FFFssFssFFFFFFsssssFsssssssss [ 82%]
FFFssFFFFFFFFFFsFssssFssssssFsssFFFssFFFFFFFFFFssssssssssssssssFFFFssFFF [ 87%]
FFFFssFFFFFFsssFFsssssssss [ 89%]
test/test_virtuoso__v8_03_3313__dbpedia.py FFFssFssFFFFFFsssssssssssssss [ 91%]
FFFssFFFFFFFFFFsssssssssssssssssFFFssFFFFFFFFFFssssssssssssssssFFFFFsFFF [ 96%]
FFFFssFFFFFFssssssssssssss [ 97%]
test/test_wrapper.py ....s..........................F... [100%]

=================================== FAILURES ===================================
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:403:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:459:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:345:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:410:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinJSONLD ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD(self):
>       result = self.__generic(askQuery, JSONLD, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:451:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:469:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:513:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:499:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:593: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML(self): > result = self.__generic(constructQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:485: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinN3(self): > result = self.__generic(constructQuery, N3, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:520: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML(self): > result = self.__generic(constructQuery, RDFXML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:506: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': 
'490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow(self): > result = self.__generic(constructQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:601: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML(self): > result = self.__generic(constructQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:492: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3(self): > result = self.__generic(describeQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:643: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinRDFXML(self): > result = self.__generic(describeQuery, RDFXML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:629: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow(self): > result = self.__generic(describeQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:724: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:615: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinN3(self): > result = self.__generic(describeQuery, N3, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:650: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinRDFXML(self): > result = self.__generic(describeQuery, RDFXML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:636: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow(self): > result = self.__generic(describeQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:732: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:622:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'agrovoc.fao.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_agrovoc-allegrograph_on_hold.py:757: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testQueryBadFormed _____________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_agrovoc-allegrograph_on_hold.py:742: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryDuplicatedPrefix(self): > result = self.__generic(queryDuplicatedPrefix, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:748: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:745: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:769: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:232: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
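The `do_open()` listing quoted repeatedly above normalizes the request headers before sending: it merges unredirected and regular headers, forces `Connection: close` (since `addinfourl` cannot manage a persistent connection), and Title-Cases the header names. A small self-contained sketch of just that step:

```python
# Header normalization as in the quoted do_open(): values are untouched,
# names are Title-Cased, and Connection is forced to "close".
raw = {"accept": "text/csv", "user-agent": "sparqlwrapper 2.0.0"}
headers = dict(raw)
headers["Connection"] = "close"
headers = {name.title(): val for name, val in headers.items()}
print(headers)
# {'Accept': 'text/csv', 'User-Agent': 'sparqlwrapper 2.0.0', 'Connection': 'close'}
```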
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON(self): > result = self.__generic(selectQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:260: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:246: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 
'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow(self): > result = self.__generic(selectQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:239: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON(self): > result = self.__generic(selectQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:267: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:253: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': 
'432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow(self): > result = self.__generic(selectQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:329: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': 
'356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinXML(self): > result = self.__generic(selectQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_agrovoc-allegrograph_on_hold.py:224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_agrovoc-allegrograph_on_hold.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:572:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:647:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:658:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host':
 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:579:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:603:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0
 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect.  If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used.  If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:614:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect.  If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used.  If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:689:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect.  If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used.  If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:698:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect.  If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used.  If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:460:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect.  If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used.  If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:468:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON(self): > result = self.__generic(askQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:586: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '515', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSONLD_Unexpected(self): > result = self.__generic(askQuery, JSONLD, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:669: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '515', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:680: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:593: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '475', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:625: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '475',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:636: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '478',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:707: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '478',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '444',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:716: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '444',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '478',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:476: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '478',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Content-Length': '444',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
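Editor's note: every failure in this run bottoms out in the same `[Errno 111] Connection refused` from `socket.create_connection()`, because nothing listens on `127.0.0.1:9` (the dead endpoint the isolated build environment points the test suite at). The refusal can be reproduced in isolation; the ephemeral-port probe below is an assumption used to guarantee a closed local port, not something the test suite itself does:

```python
import socket

# Grab an ephemeral port, then close it, so that (barring a rare race)
# nothing is listening there -- mirroring the dead 127.0.0.1:9 endpoint.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

refused = False
try:
    socket.create_connection(("127.0.0.1", port), timeout=5)
except ConnectionRefusedError:
    # create_connection() tried every getaddrinfo() result, collected the
    # errors, and re-raised the first one (ECONNREFUSED, errno 111 on Linux).
    refused = True
print("connection refused:", refused)
```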
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:484: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected(self): > result = self.__generic(constructQuery, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:885: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:894: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
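Editor's note: as the repeated `do_open()` frames show, `urllib` catches the low-level `OSError` and re-raises it as `urllib.error.URLError`, which is the exception type pytest ultimately reports for each test. A minimal sketch of that wrapping, again assuming a locally closed port rather than the suite's own endpoint:

```python
import socket
import urllib.error
import urllib.request

# Find a port with no listener: bind to port 0, note the number, close it.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

reason_name = None
try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=5)
except urllib.error.URLError as exc:
    # do_open() wraps the OSError; the original error is kept as exc.reason.
    reason_name = type(exc.reason).__name__
print(reason_name)  # ConnectionRefusedError on Linux
```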
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected(self): > result = self.__generic(constructQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:921: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
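Editor's note: the "During handling of the above exception, another exception occurred" banners are Python's implicit exception chaining: the `ConnectionRefusedError` raised while the `except OSError` block runs becomes the `__context__` of the `URLError`, and pytest prints both tracebacks. A small stand-alone illustration (using a stand-in exception class, not urllib's real `URLError`):

```python
# Implicit chaining: an exception raised inside an except-block carries the
# original error as __context__, producing the "During handling of the above
# exception" banner in tracebacks.
class FakeURLError(Exception):  # hypothetical stand-in for urllib.error.URLError
    pass

chained = None
try:
    try:
        raise ConnectionRefusedError(111, "Connection refused")
    except OSError as err:
        raise FakeURLError(err)
except FakeURLError as exc:
    chained = type(exc.__context__).__name__
print(chained)  # ConnectionRefusedError
```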
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:930: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:823:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 
           'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:830:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 
           'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:760:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:768:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml',
           'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:956: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:964: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML(self): > result = self.__generic(constructQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:731: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:738: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected(self): > result = self.__generic(constructQuery, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:903: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:912:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close',
           'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
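An aside on the do_open() listing that recurs in these traces: before sending, the handler forces "Connection: close" (so the single-shot addinfourl response never blocks on a kept-alive socket) and title-cases the header names. A minimal standalone sketch of that normalization; the sample header names and values below are illustrative only, not taken from the failing requests:

```python
# Sketch of the header normalization performed in urllib.request's
# do_open() (see the source listing in the traceback above).
raw_headers = {"accept": "*/*",
               "content-type": "application/x-www-form-urlencoded"}

headers = dict(raw_headers)
# do_open() always closes the connection after the (only) request:
headers["Connection"] = "close"
# ...and normalizes header names to Title-Case before sending:
headers = {name.title(): val for name, val in headers.items()}

print(headers)
```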
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '665',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:939:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '665',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
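Every failure in this run follows the same two-step pattern: the TCP connect to the closed endpoint 127.0.0.1:9 raises ConnectionRefusedError, and do_open() re-raises that OSError as urllib.error.URLError, which is what each test ultimately reports. A minimal sketch of the wrapping; open_connection() and refused_connect() are hypothetical stand-ins, not SPARQLWrapper or stdlib names:

```python
from urllib.error import URLError

def open_connection(connect):
    # Mirrors the except-clause in urllib.request's do_open(): any
    # OSError raised while connecting is re-raised as URLError.
    try:
        return connect()
    except OSError as err:
        raise URLError(err)

def refused_connect():
    # Stand-in for a socket connect against a closed port such as
    # 127.0.0.1:9 in the test runs above. ConnectionRefusedError is
    # an OSError subclass, so the wrapper catches it.
    raise ConnectionRefusedError(111, "Connection refused")

caught = None
try:
    open_connection(refused_connect)
except URLError as exc:
    caught = exc
    print(type(exc.reason).__name__)  # prints "ConnectionRefusedError"
```

URLError keeps the original exception available as its .reason attribute, which is why the stripped reprs in the log once read "urlopen error" followed by the underlying errno message.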
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:948:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection':
           'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '659',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:837:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
           'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '659',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:844:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
           'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '768', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML(self): > result = self.__generic(constructQuery, RDFXML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:776: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '768', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:784: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 
'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow(self): > result = self.__generic(constructQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:972: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow_Conneg(self): > result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:980: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 
'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_allegrograph__v4_14_1__mmi.py:745:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , http_class = , req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '662',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '628',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:752:

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, GET)

test/test_allegrograph__v4_14_1__mmi.py:1148:

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1158:

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:1185:

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1194:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = , http_class = , req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() and create_connection() listings and the http.client call chain are identical to the first traceback above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)

test/test_allegrograph__v4_14_1__mmi.py:1086:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() source repeated; identical to the first traceback above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() and create_connection() listings and the http.client call chain are identical to the first traceback above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1093:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() source repeated; identical to the first traceback above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() and create_connection() listings and the http.client call chain are identical to the first traceback above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1023:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() source repeated; identical to the first traceback above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() and create_connection() listings and the http.client call chain are identical to the first traceback above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1031:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() source repeated; identical to the first traceback above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open() and create_connection() listings and the http.client call chain are identical to the first traceback above]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_allegrograph__v4_14_1__mmi.py:1220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1228: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:994: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:1001: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1167:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '465',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1176:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '468',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:1203:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1212:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '462',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3(self):
>       result = self.__generic(describeQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:1100:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1107:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

self =  http_class =  req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '571',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, POST)

test/test_allegrograph__v4_14_1__mmi.py:1039:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '571',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1047:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '465',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)

test/test_allegrograph__v4_14_1__mmi.py:1236:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '465',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:1244:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '431',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '465',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)

test/test_allegrograph__v4_14_1__mmi.py:1008:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '465',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() source identical to the listing above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[http.client and socket frames identical to those above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    [create_connection() source identical to the listing above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1015: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
[SPARQLWrapper and urllib.request frames identical to those above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() source identical to the listing above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source identical to the listing above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[http.client and socket frames identical to those above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    [create_connection() source identical to the listing above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_allegrograph__v4_14_1__mmi.py:1269: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
[SPARQLWrapper and urllib.request frames identical to those above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source identical to the listing above]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    [do_open() source begins here, identical to the listing above]
    [remainder of the do_open() source, identical to the listing above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[http.client and socket frames identical to those above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    [create_connection() source identical to the listing above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1254: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
[SPARQLWrapper and urllib.request frames identical to those above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    [do_open() source identical to the listing above]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source begins here, identical to the listing above]
    [remainder of the do_open() source, identical to the listing above]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[http.client and socket frames identical to those above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    [create_connection() source identical to the listing above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1260: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
[SPARQLWrapper and urllib.request frames identical to those above]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [do_open() source identical to the listing above]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1257:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
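Every failure in this run bottoms out in the same `ConnectionRefusedError: [Errno 111]` from `socket.create_connection`, because nothing is listening on 127.0.0.1 port 9. The error is easy to reproduce offline; the sketch below assumes the ephemeral port it releases is not re-bound by another process in the meantime:

```python
import errno
import socket

# Obtain a local TCP port that is currently closed: bind to port 0 so
# the OS assigns a free port, record its number, then release it.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
closed_port = probe.getsockname()[1]
probe.close()

# With no listener on the port, connect() fails the same way as the log.
try:
    socket.create_connection(("127.0.0.1", closed_port), timeout=5)
except ConnectionRefusedError as err:
    refused = err  # [Errno 111] Connection refused (ECONNREFUSED) on Linux
```

`create_connection` tries each `getaddrinfo` result in turn and re-raises the last error when all attempts fail, which is why the traceback shows both `raise exceptions[0]` and the original `sock.connect(sa)` frame.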
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_allegrograph__v4_14_1__mmi.py:1281:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
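The frame at urllib/request.py:1322 explains why pytest reports `urllib.error.URLError` rather than the underlying error: `do_open` catches `OSError` (of which `ConnectionRefusedError` is a subclass) and re-raises it wrapped in `URLError`, preserving the original exception as `.reason`. A minimal sketch of the same wrapping pattern, using the bind-and-release trick to get a port that is known to be closed (an assumption that nothing re-binds it immediately):

```python
import socket
from urllib.error import URLError

def open_http_connection(host, port):
    # Same pattern as urllib's do_open: any low-level OSError is
    # re-raised as URLError so callers handle one exception type.
    try:
        return socket.create_connection((host, port), timeout=5)
    except OSError as err:
        raise URLError(err)

# A local port known to be closed: bind to port 0, record, release.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
closed_port = probe.getsockname()[1]
probe.close()

try:
    open_http_connection("127.0.0.1", closed_port)
except URLError as err:
    # The original ConnectionRefusedError survives as err.reason.
    wrapped_reason = err.reason
```

This is why the log shows the `ConnectionRefusedError` traceback first and then, under "During handling of the above exception, another exception occurred", the `URLError` that the test machinery actually sees.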
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_allegrograph__v4_14_1__mmi.py:245:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:252:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close',
'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:301:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected(self): > result = self.__generic(selectQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:376: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:387: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:308: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected(self): > result = self.__generic(selectQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:332: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:343: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:273: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:280: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:418: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:427: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:213: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'mmisw.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:221: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 
'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '664', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:259: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '664', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:266: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 
'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '487', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON(self): > result = self.__generic(selectQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:315: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '487', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:398: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 
'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
test/test_allegrograph__v4_14_1__mmi.py:409: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '450',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
test/test_allegrograph__v4_14_1__mmi.py:322: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '450',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '481',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
test/test_allegrograph__v4_14_1__mmi.py:354: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 
'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
test/test_allegrograph__v4_14_1__mmi.py:365: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '764', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:287: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '764', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:294: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 
'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow(self): > result = self.__generic(selectQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:436: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow_Conneg(self): > result = self.__generic(selectQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_allegrograph__v4_14_1__mmi.py:445: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_allegrograph__v4_14_1__mmi.py:189: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 
'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:229: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '484',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '450',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:237: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '450',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:580: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:655: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:666: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:587: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)

test/test_blazegraph__wikidata.py:611: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:622: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_blazegraph__wikidata.py:697: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:706: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_blazegraph__wikidata.py:484: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
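Every failure in this run bottoms out in the same place: the suite opens an HTTPS connection to 127.0.0.1:9 (the Host header still reads query.wikidata.org, so the endpoint is presumably remapped by the network-isolated build environment; TCP port 9 is the traditional discard port and nothing listens there), the kernel refuses the connection, and urllib wraps the resulting OSError in a URLError. A minimal sketch of that two-layer failure, assuming the bind-then-close trick yields a port that is closed when we reconnect (reliable on a Linux loopback, not strictly guaranteed):

```python
import socket
import urllib.error
import urllib.request

def find_closed_port() -> int:
    # Bind an ephemeral port, then close the socket: the port is now
    # (almost certainly) closed, so connecting to it is refused, just
    # like port 9 in the build environment above.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]

port = find_closed_port()

# Layer 1: the raw socket failure seen in socket.py:849.
try:
    socket.create_connection(("127.0.0.1", port), timeout=5)
except ConnectionRefusedError as err:
    print("socket layer:", err)  # typically [Errno 111] Connection refused

# Layer 2: urllib catches the OSError and re-raises it as URLError,
# which is the exception the traceback shows at request.py:1322.
try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=5)
except urllib.error.URLError as err:
    print("urllib layer:", err.reason)
```

The `find_closed_port` helper is illustrative only; the build itself hard-codes 127.0.0.1:9.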
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByGETinXML_Conneg>

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:492:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '423', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSON>

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_blazegraph__wikidata.py:594:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '423', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '457', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected>

    def testAskByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, POST)

test/test_blazegraph__wikidata.py:677:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '457', 'Content-Type': 'application/x-www-form-urlencoded', ...}
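The failures repeat the same pattern for every query method and return format: each test needs a reachable SPARQL endpoint, and in this environment there is none. A hypothetical sketch of how such tests can run offline by stubbing the transport with unittest.mock (the `fetch` helper below is illustrative, not SPARQLWrapper's actual API; SPARQLWrapper's `_query()` ultimately calls `urllib.request.urlopen` and reads the response, which is what the stub replaces):

```python
import io
import urllib.request
from unittest import mock

def fetch(url: str) -> bytes:
    # Stand-in for the query path: open the URL and read the body.
    with urllib.request.urlopen(url) as resp:
        return resp.read()

# A canned ASK-query response; BytesIO behaves enough like an
# HTTPResponse (read() plus context-manager support) for this sketch.
canned = io.BytesIO(b'{"boolean": true}')

with mock.patch("urllib.request.urlopen", return_value=canned):
    body = fetch("https://query.wikidata.org/sparql")

assert body == b'{"boolean": true}'  # no socket was opened
```

With the transport patched, a test like testAskByPOSTinJSON would exercise the wrapper's request construction and result parsing without ever touching 127.0.0.1:9.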
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected_Conneg>

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:688:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:601: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '417', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinN3_Unexpected(self): > result = self.__generic(askQuery, N3, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:633: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '417', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:644: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow(self): > result = self.__generic(askQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:715: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:724:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:500:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:508:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:915:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:924:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:887:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:962:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:849:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)

test/test_blazegraph__wikidata.py:768:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:776:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:811:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)

test/test_blazegraph__wikidata.py:990:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:998:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)

test/test_blazegraph__wikidata.py:739:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:746:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected(self): > result = self.__generic(constructQuery, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:933: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:942: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:906: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:982: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                    del headers[proxy_auth_hdr]
                h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

            try:
                try:
>                   h.request(req.get_method(), req.selector, req.data, headers,
                              encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:868:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '615',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                    del headers[proxy_auth_hdr]
                h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

            try:
                try:
                    h.request(req.get_method(), req.selector, req.data, headers,
                              encode_chunked=req.has_header('Transfer-encoding'))
                except OSError as err: # timeout error
>                   raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '755', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, POST)

test/test_blazegraph__wikidata.py:784:
>                   raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:792:
>                   raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:830:
>                   raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_blazegraph__wikidata.py:1006:
>                   raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1014:

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_blazegraph__wikidata.py:753: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:760: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, GET)

test/test_blazegraph__wikidata.py:1200: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1210: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1174: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1248: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3_Conneg(self): > result = self.__generic(describeQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1138: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinRDFXML(self): > result = self.__generic(describeQuery, RDFXML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1057: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1065:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1100:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1276:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1028:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1035:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'query.wikidata.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    [ ... same do_open, http.client and socket.create_connection frames as above ... ]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, POST)

test/test_blazegraph__wikidata.py:1219:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [ ... same urllib.request frames as above ... ]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    [ ... same do_open listing as above ... ]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    [ ... same do_open, http.client and socket.create_connection frames as above ... ]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [ ... same urllib.request frames as above ... ]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    [ ... same do_open listing as above ... ]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [ ... same do_open, http.client and socket.create_connection frames as above ... ]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1191:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [ ... same urllib.request frames as above ... ]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [ ... same do_open listing as above ... ]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    [ ... same do_open, http.client and socket.create_connection frames as above ... ]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1268:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [ ... same urllib.request frames as above ... ]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    [ ... same do_open listing as above ... ]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
'Connection': 'close', 'Content-Length': '350',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    [ ... same do_open, http.client and socket.create_connection frames as above ... ]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1157:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [ ... same urllib.request frames as above ... ]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinRDFXML(self): > result = self.__generic(describeQuery, RDFXML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1073: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1081: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1119: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 
'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
    When a connection cannot be created, raises the last error if *all_errors*
    is False, and an ExceptionGroup of all errors if *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1292:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1300: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML(self): > result = self.__generic(describeQuery, XML, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1042: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML_Conneg(self): > result = self.__generic(describeQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:1049: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_blazegraph__wikidata.py:1328: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 
'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testQueryBadFormed _____________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_blazegraph__wikidata.py:1310: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1313: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1332: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1341: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:267: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:325: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:400:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:411:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:332:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:356:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:367:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:301:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:442:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:451:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:225:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'query.wikidata.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:233:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:284:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '442', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSON(self):
>       result = self.__generic(selectQuery, JSON, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:339:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '442', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '476', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:422:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '476', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:433:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:346: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected(self): > result = self.__generic(selectQuery, N3, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:378: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:389: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': 
'405', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_blazegraph__wikidata.py:318: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_blazegraph__wikidata.py:201: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 
'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:460:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:469:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:241:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:249:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperCLIParser_Test.testInvalidFormat _________________

self = 

    def testInvalidFormat(self):
        with self.assertRaises(SystemExit) as cm:
            parse_args(["-Q", testquery, "-F", "jjssoonn"])
        self.assertEqual(cm.exception.code, 2)
>       self.assertEqual(
            sys.stderr.getvalue().split("\n")[1],
            "rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld')",
        )
E       AssertionError: "rqw:[65 chars]from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld)" != "rqw:[65 chars]from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rd[28 chars]ld')"
E       - rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld)
E       + rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld')
E       ? + + + + + + + + + + + + + + + + + +

test/test_cli.py:79: AssertionError
______________________ SPARQLWrapperCLI_Test.testQueryRDF ______________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryRDF(self):
>       main(["-Q", "DESCRIBE ", "-e", endpoint, "-F", "rdf"])

test/test_cli.py:249:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperCLI_Test.testQueryTo4store ____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryTo4store(self): > main(["-e", "http://rdf.chise.org/sparql", "-Q", testquery]) test/test_cli.py:627: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperCLI_Test.testQueryToAgrovoc_AllegroGraph _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryToAgrovoc_AllegroGraph(self): > main(["-e", "https://agrovoc.fao.org/sparql", "-Q", testquery]) test/test_cli.py:459: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel 
Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperCLI_Test.testQueryToAllegroGraph _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryToAllegroGraph(self): > main(["-e", "https://mmisw.org/sparql", "-Q", testquery]) test/test_cli.py:378: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 
10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperCLI_Test.testQueryToBrazeGraph __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryToBrazeGraph(self): > main(["-e", "https://query.wikidata.org/sparql", "-Q", testquery]) test/test_cli.py:546: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 
10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_6 _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryToFuseki2V3_6(self):
>       main(["-e", "https://agrovoc.uniroma2.it/sparql/", "-Q", testquery])

test/test_cli.py:573:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

[do_open locals and source repeated verbatim as above; host = '127.0.0.1:9']
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_8 _________________

[do_open locals and source as above; http_conn_args = {}, host = '127.0.0.1:9']

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))

[http.client frames as above, minus the HTTPS connect at client.py:1472]
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

[create_connection as above; address = ('127.0.0.1', 9)]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryToFuseki2V3_8(self):
>       main(["-e", "http://zbw.eu/beta/sparql/stw/query", "-Q", testquery])

test/test_cli.py:600:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

[do_open locals and source repeated as above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperCLI_Test.testQueryToGraphDBEnterprise ______________

[do_open/create_connection traceback as above; http_conn_args = {}, host = '127.0.0.1:9']
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryToGraphDBEnterprise(self):
>       main(["-e", "http://factforge.net/repositories/ff-news", "-Q", testquery])

test/test_cli.py:405:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

[do_open locals and source repeated as above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperCLI_Test.testQueryToLovFuseki __________________

[do_open/create_connection traceback as above; http_conn_args = {'context': }, host = '127.0.0.1:9']
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryToLovFuseki(self):
>       main(["-e", "https://lov.linkeddata.es/dataset/lov/sparql/", "-Q", testquery])

test/test_cli.py:317:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

[do_open locals and source repeated as above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperCLI_Test.testQueryToRDF4J ____________________

[do_open/create_connection traceback as above; http_conn_args = {}, host = '127.0.0.1:9']
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryToRDF4J(self):
>       main(
            [
                "-e",
                "http://vocabs.ands.org.au/repository/api/sparql/csiro_international-chronostratigraphic-chart_2018-revised-corrected",
                "-Q",
                testquery,
            ]
        )

test/test_cli.py:344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

[do_open locals and source repeated as above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperCLI_Test.testQueryToStardog ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '102', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open source as above]

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

[create_connection as above; address = ('127.0.0.1', 9)]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryToStardog(self):
>       main(["-e", "https://lindas.admin.ch/query", "-Q", testquery, "-m", POST])

test/test_cli.py:432:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '102',
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV7 __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryToVirtuosoV7(self): > main(["-e", "http://dbpedia.org/sparql", "-Q", testquery]) test/test_cli.py:516: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV8 __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryToVirtuosoV8(self): > main(["-e", "http://dbpedia-live.openlinksw.com/sparql", "-Q", testquery]) test/test_cli.py:486: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperCLI_Test.testQueryWithEndpoint __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithEndpoint(self): > main( [ "-Q", testquery, "-e", endpoint, ] ) test/test_cli.py:97: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperCLI_Test.testQueryWithFile ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                    del headers[proxy_auth_hdr]
                h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithFile(self):
>       main(["-f", testfile, "-e", endpoint])

test/test_cli.py:135: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperCLI_Test.testQueryWithFileCSV __________________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
[remainder of do_open and the urllib/http.client/socket traceback elided;
byte-for-byte identical to the testQueryWithFile failure above, ending in
ConnectionRefusedError: [Errno 111] Connection refused at
/usr/lib/python3.13/socket.py:849, re-raised as urllib.error.URLError at
/usr/lib/python3.13/urllib/request.py:1322]

    def testQueryWithFileCSV(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "csv"])

test/test_cli.py:291: 
__________________ SPARQLWrapperCLI_Test.testQueryWithFileN3 ___________________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def testQueryWithFileN3(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "n3"])

test/test_cli.py:232: 
[identical ConnectionRefusedError / URLError traceback elided]
________________ SPARQLWrapperCLI_Test.testQueryWithFileRDFXML _________________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def testQueryWithFileRDFXML(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "rdf+xml"])

test/test_cli.py:270: 
[identical ConnectionRefusedError / URLError traceback elided]
__________________ SPARQLWrapperCLI_Test.testQueryWithFileTSV __________________

host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def testQueryWithFileTSV(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "tsv"])

test/test_cli.py:304: 
[identical ConnectionRefusedError / URLError traceback elided]
________________ SPARQLWrapperCLI_Test.testQueryWithFileTurtle _________________

host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def testQueryWithFileTurtle(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "turtle"])

test/test_cli.py:188: 
[identical ConnectionRefusedError / URLError traceback elided]
______________ SPARQLWrapperCLI_Test.testQueryWithFileTurtleQuiet ______________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFileTurtleQuiet(self): > main( [ "-f", testfile, "-e", endpoint, "-F", "turtle", "-q", ] ) test/test_cli.py:205: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperCLI_Test.testQueryWithFileXML __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'ja.dbpedia.org',
 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithFileXML(self): > main(["-f", testfile, "-e", endpoint, "-F", "xml"]) test/test_cli.py:167: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/main.py:137: in main results = sparql.query().convert() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:496:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON(self): > result = self.__generic(askQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:545: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:629: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:552: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:587: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:517:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

[do_open() and create_connection() listings identical to the first failure
above; the connection to 127.0.0.1:9 is refused]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:524:

host = '127.0.0.1:9'
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

[do_open() and create_connection() listings identical to the first failure
above; the connection to 127.0.0.1:9 is refused]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)

test/test_fuseki2__v3_6_0__agrovoc.py:658:

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

[do_open() and create_connection() listings identical to the first failure
above; the connection to 127.0.0.1:9 is refused]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:667:

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

[do_open() and create_connection() listings identical to the first failure
above; the connection to 127.0.0.1:9 is refused]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:457:

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

[do_open() and create_connection() listings identical to the first failure
above; the connection to 127.0.0.1:9 is refused]

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:465:

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinCSV(self): > result = self.__generic(askQuery, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:503: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinCSV_Conneg(self): > result = self.__generic(askQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:510: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '302', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '339', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON(self): > result = self.__generic(askQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:559: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Content-Length': '339', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:650: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:566:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() listing identical to the one above ...]
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() listing identical to the one above, through the
        "# Proxy-Authorization should not be sent to origin server." comment ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client/socket.py frame chain identical to the one shown above
    (client.py:1338 -> 1384 -> 1333 -> 1093 -> 1037 -> 1472 -> 1003 -> socket.py:864) ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() listing identical to the one above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:608:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[... SPARQLWrapper/urllib frame chain identical to the one shown above
    (Wrapper.py:960 -> 926, request.py:189 -> 489 -> 506 -> 466 -> 1367 https_open) ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() listing identical to the one above ...]
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() listing identical to the one above, through the
        "# Proxy-Authorization should not be sent to origin server." comment ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client/socket.py frame chain identical to the one shown above
    (client.py:1338 -> 1384 -> 1333 -> 1093 -> 1037 -> 1472 -> 1003 -> socket.py:864) ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() listing identical to the one above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinTSV(self):
>       result = self.__generic(askQuery, TSV, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:531:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[... SPARQLWrapper/urllib frame chain identical to the one shown above
    (Wrapper.py:960 -> 926, request.py:189 -> 489 -> 506 -> 466 -> 1367 https_open) ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '436', 'Content-Type':
'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() listing identical to the one above ...]
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() listing identical to the one above, through the
        "# Proxy-Authorization should not be sent to origin server." comment ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client/socket.py frame chain identical to the one shown above
    (client.py:1338 -> 1384 -> 1333 -> 1093 -> 1037 -> 1472 -> 1003 -> socket.py:864) ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() listing identical to the one above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:538:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[... SPARQLWrapper/urllib frame chain identical to the one shown above
    (Wrapper.py:960 -> 926, request.py:189 -> 489 -> 506 -> 466 -> 1367 https_open) ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() listing identical to the one above ...]
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() listing identical to the one above, through the
        "# Proxy-Authorization should not be sent to origin server." comment ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client/socket.py frame chain identical to the one shown above
    (client.py:1338 -> 1384 -> 1333 -> 1093 -> 1037 -> 1472 -> 1003 -> socket.py:864) ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() listing identical to the one above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:676:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
[... SPARQLWrapper/urllib frame chain identical to the one shown above
    (Wrapper.py:960 -> 926, request.py:189 -> 489 -> 506 -> 466 -> 1367 https_open) ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336',
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]

            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError

_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302',
'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:685:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection':
'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError

____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336',
'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:473:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError

_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302',
'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:481:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError

_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:874:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError

________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:831:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError

_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection. A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:839: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection. A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:901: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection. A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:910: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection. A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:806: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection. A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:738: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:772: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:935: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:943: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:700: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:707: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:893: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '702', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSONLD(self): > result = self.__generic(constructQuery, JSONLD, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:847: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 
'Content-Length': '702', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:855: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '527', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSON_Unexpected(self): > result = self.__generic(constructQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:918: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '527', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:927:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close',
 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:823:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept':
 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml',
 'Connection': 'close', 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:789:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle',
 'Connection': 'close', 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '524', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """
    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:951:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '524',
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                    del headers[proxy_auth_hdr]
                h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... do_open()/create_connection() frames identical to the failure
    above elided ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:959:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '524',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... do_open()/create_connection() frames identical to the failure
    above elided ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:714:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '490',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    [... do_open()/create_connection() frames identical to the failure
    above elided ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:721:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open()/create_connection() frames identical to the failure
    above elided ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1143:
    [... urllib.request / do_open() frames identical to the failure above
    elided; the traceback ends as follows ...]
                    del headers[proxy_auth_hdr]
                h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open()/create_connection() frames identical to the failure
    above elided ...]
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1103:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib.request frames identical to the failure above elided ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

    [... do_open() source listing identical to the failure above elided, up to
    the proxy-tunnel branch ...]
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1110:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1170:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1179:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1079:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1011:
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection':
'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1045: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 
'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow(self): > result = self.__generic(describeQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1204: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1212: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:973: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:980: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1162: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '501', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1117: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '501', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1124: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '326', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1187: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '326', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() -> h.request() -> http.client -> socket.create_connection()
     frames identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1196:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request frames identical to the first failure above, ending in
     https_open -> do_open(http.client.HTTPSConnection, req, ...)]

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() source identical to the first failure above, through:]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() -> h.request() -> http.client -> socket.create_connection()
     frames identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1096:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request frames identical to the first failure above, ending in
     https_open -> do_open(http.client.HTTPSConnection, req, ...)]

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
           'Connection': 'close', 'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() source identical to the first failure above, through:]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() -> h.request() -> http.client -> socket.create_connection()
     frames identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1028:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request frames identical to the first failure above, ending in
     https_open -> do_open(http.client.HTTPSConnection, req, ...)]

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() source identical to the first failure above, through:]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() -> h.request() -> http.client -> socket.create_connection()
     frames identical to the first failure above]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:1062:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [urllib.request frames identical to the first failure above, ending in
     https_open -> do_open(http.client.HTTPSConnection, req, ...)]

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Content-Length': '289',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    [do_open() source identical to the first failure above, through:]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1220:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '323',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1228:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection':
 'close', 'Content-Length': '289',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:987:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '323', 'Content-Type':
 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_fuseki2__v3_6_0__agrovoc.py:1253:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection':
 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_fuseki2__v3_6_0__agrovoc.py:1238: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryDuplicatedPrefix(self): > result = self.__generic(queryDuplicatedPrefix, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1244: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1241: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_1(self): > result = self.__generic(queryWithCommaInCurie_1, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:1257: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1266:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host':
 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:246:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:253:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:302:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:386:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
            del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:309:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:344: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:274: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:281: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:415: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:424: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection':
'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML(self): > result = self.__generic(selectQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:214: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '466', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:260: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '466', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:267: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 
'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '393', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON(self): > result = self.__generic(selectQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:316: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Content-Length': '393', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:407: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:323: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:365: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 
'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '566', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:288: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '566', 
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
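Every failure above bottoms out in the same `sock.connect(sa)` call inside `socket.create_connection`. A minimal sketch of that failure mode, independent of SPARQLWrapper (it assumes a Linux host, and picks an ephemeral port with no listener rather than the log's hard-coded port 9):

```python
import socket

# Find a local port with no listener: bind an ephemeral port,
# note its number, then close it so nothing is listening there.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
unused_port = probe.getsockname()[1]
probe.close()

# With no server listening, create_connection fails just as in the
# traceback: the connect() in its getaddrinfo loop raises
# ConnectionRefusedError ([Errno 111] on Linux), and the last error
# is re-raised via `raise exceptions[0]`.
try:
    socket.create_connection(("127.0.0.1", unused_port), timeout=1)
    refused = False
except ConnectionRefusedError:
    refused = True

print(refused)
```

This is why all the Fuseki tests fail identically: nothing is listening on `127.0.0.1:9` in the build chroot, so every request is refused before any HTTP traffic occurs.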
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
test/test_fuseki2__v3_6_0__agrovoc.py:295:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values',
 'Connection': 'close', 'Content-Length': '432',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
test/test_fuseki2__v3_6_0__agrovoc.py:433:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
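The "During handling of the above exception, another exception occurred" pattern repeated in these reports comes from `do_open` catching the socket-level `OSError` and re-raising it as `URLError`. A minimal sketch of that wrapping, again using a deliberately unused ephemeral port instead of the log's port 9:

```python
import socket
import urllib.error
import urllib.request

# Reserve and immediately release an ephemeral port so that
# nothing is listening on it when we connect.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
dead_port = probe.getsockname()[1]
probe.close()

# urlopen() lets the ConnectionRefusedError propagate up to
# do_open(), whose `except OSError as err: raise URLError(err)`
# produces the chained URLError seen throughout this log.
try:
    urllib.request.urlopen(f"http://127.0.0.1:{dead_port}/", timeout=1)
    reason = None
except urllib.error.URLError as err:
    reason = err.reason  # the original ConnectionRefusedError

print(type(reason).__name__)
```

So each test reports two stacked tracebacks: the inner `ConnectionRefusedError` from `socket.py` and the outer `URLError` from `urllib/request.py:1322`.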
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
test/test_fuseki2__v3_6_0__agrovoc.py:442:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml',
 'Connection': 'close', 'Content-Length': '356',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
test/test_fuseki2__v3_6_0__agrovoc.py:230:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinXML_Conneg(self): > result = self.__generic(selectQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_6_0__agrovoc.py:238: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 
'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinCSV(self): > result = self.__generic(askQuery, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:493: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the 
request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinCSV_Conneg(self): > result = self.__generic(askQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:500: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON(self): > result = self.__generic(askQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:549: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:633: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:556: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:591:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:521:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:528:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:662:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
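The `create_connection()` docstring quoted in this traceback describes exactly the behaviour every failure in this log exercises: each `sock.connect()` attempt fails, and the last error is re-raised. A minimal, self-contained sketch of that failure mode — using a throwaway local port with no listener, rather than the `127.0.0.1:9` proxy address these tests were pointed at (the port-reservation trick is an illustration, not part of the test suite):

```python
import errno
import socket

# Reserve a local port with no listener: bind an ephemeral port,
# record its number, then close the socket again so nothing listens there.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
unused_port = probe.getsockname()[1]
probe.close()

# Mirror the failing call in the log: create_connection() iterates over
# getaddrinfo() results, calls sock.connect() on each, and re-raises the
# last error when every attempt fails.
caught = None
try:
    socket.create_connection(("127.0.0.1", unused_port), timeout=5)
except ConnectionRefusedError as err:
    caught = err

print(caught is not None and caught.errno == errno.ECONNREFUSED)
```

On Linux `ECONNREFUSED` is errno 111, matching the `[Errno 111] Connection refused` lines throughout this log.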
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:671: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML(self): > result = self.__generic(askQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:461: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse 
object for the request, using http_class.
        """
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:469: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinCSV(self): > result = self.__generic(askQuery, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:507: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, 
using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        ...
        When a connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinCSV_Conneg(self): > result = self.__generic(askQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:514: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return 
an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '333', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
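An aside on the `do_open` listing repeated throughout these failures: before sending, urllib canonicalizes header names with `str.title()`, which is why the locals dumps above always show `Content-Type` and `Accept` regardless of how the caller spelled them. A standalone sketch of that normalization (illustrative only, not part of this build):

```python
# urllib's do_open canonicalizes header names via str.title(), which
# capitalizes the first letter of each hyphen-separated word.
raw = {
    "content-type": "application/x-www-form-urlencoded",
    "ACCEPT": "text/csv",
    "connection": "close",
}
normalized = {name.title(): val for name, val in raw.items()}
print(normalized)
# → {'Content-Type': 'application/x-www-form-urlencoded', 'Accept': 'text/csv', 'Connection': 'close'}
```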
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:563:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '333', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:654:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:570:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
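As the `do_open` listing shows, the low-level `OSError` is re-raised as `urllib.error.URLError`, with the original exception preserved in `.reason`. A sketch of that wrapping against a closed local port (illustrative only; a `ProxyHandler` with no proxies is used so the request really hits the closed port even when proxy environment variables are set, as on this builder):

```python
import socket
import urllib.error
import urllib.request

# A local port with no listener: connections to it are refused.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
closed_port = srv.getsockname()[1]
srv.close()

# Bypass any configured proxy so the refusal happens locally.
opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))
try:
    opener.open(f"http://127.0.0.1:{closed_port}/", timeout=2)
except urllib.error.URLError as exc:
    # do_open wraps the OSError in URLError; the original
    # ConnectionRefusedError survives as exc.reason.
    wrapped = exc.reason
```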
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =  source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:612:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '430', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinTSV(self): > result = self.__generic(askQuery, TSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:535: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '430', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object 
for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() body identical to the first listing above ...]

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________

self = <urllib.request.HTTPHandler object at 0x...>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x...>
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinTSV_Conneg(self): > result = self.__generic(askQuery, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:542: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() body identical to the first listing above ...]

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

self = <urllib.request.HTTPHandler object at 0x...>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() body identical to the first listing above ...]

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client / socket.py call chain identical to the first failure above ...]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    [... create_connection() listing identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow>

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_fuseki2__v3_8_0__stw.py:680:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[... SPARQLWrapper / urllib.request call chain identical to the first failure above ...]
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x...>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an
HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() body identical to the first listing above ...]

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self = <urllib.request.HTTPHandler object at 0x...>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow_Conneg(self): > result = self.__generic(askQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:689: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() body identical to the first listing above ...]

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self = <urllib.request.HTTPHandler object at 0x...>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() body identical to the first listing above ...]

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client / socket.py call chain identical to the first failure above ...]

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    [... create_connection() listing identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinXML>

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)

test/test_fuseki2__v3_8_0__stw.py:477:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
[... SPARQLWrapper / urllib.request call chain identical to the first failure above ...]
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPHandler object at 0x...>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse
object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() body identical to the first listing above ...]

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self = <urllib.request.HTTPHandler object at 0x...>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x...>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:485: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source context identical to the first failure above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
        [... http.client and socket frames identical to the first failure above ...]

address = ('127.0.0.1', 9), timeout = <...>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:878:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
        [... urllib frames identical to the first failure above ...]

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source context identical to the first failure above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
        [... http.client and socket frames identical to the first failure above ...]

address = ('127.0.0.1', 9), timeout = <...>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_fuseki2__v3_8_0__stw.py:835:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
        [... urllib frames identical to the first failure above ...]

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source context identical to the first failure above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
        [... http.client and socket frames identical to the first failure above ...]

address = ('127.0.0.1', 9), timeout = <...>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:843:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
        [... urllib frames identical to the first failure above ...]

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source context identical to the first failure above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
        [... http.client and socket frames identical to the first failure above ...]

address = ('127.0.0.1', 9), timeout = <...>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... create_connection() source context identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <...>

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)

test/test_fuseki2__v3_8_0__stw.py:905:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
        [... urllib frames identical to the first failure above ...]

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an
HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... do_open() source context identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self = <...>, http_class = <...>, req = <...>, http_conn_args = {}
host = '127.0.0.1:9', h = <...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open() source context identical to the first failure above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
        [... http.client and socket frames identical to the first failure above ...]

address = ('127.0.0.1', 9), timeout = <...>, source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        [... docstring identical to the first failure above ...]
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:914: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:810: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:742: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:776: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:939: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:947: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML(self): > result = self.__generic(constructQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:704: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinXML_Conneg(self): > result = self.__generic(constructQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:711: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:897: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '696', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:851:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '696', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        [... body of do_open() identical to the first listing above ...]
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client / socket call chain identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... docstring and body identical to the first listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:859:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request call chain identical to the first traceback above ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... listing identical to the first do_open() traceback above ...]
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        [... body of do_open() identical to the first listing above ...]
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client / socket call chain identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... docstring and body identical to the first listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:922:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request call chain identical to the first traceback above ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... listing identical to the first do_open() traceback above ...]
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:931: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client / socket call chain identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... docstring and body identical to the first listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:827:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... urllib.request call chain identical to the first traceback above ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... listing identical to the first do_open() traceback above ...]
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        [... body of do_open() identical to the first listing above, through ...]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:759: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
                http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object. Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect. If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used. If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection. A
        host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors* is
        False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:793:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def
        do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object. Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect. If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used. If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection. A
        host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors* is
        False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:955:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an
HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object. Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect. If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used. If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection. A
        host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors* is
        False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:963:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self,
                http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items() if k not in headers})
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
        Convenience function. Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object. Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect. If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used. If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection. A
        host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors* is
        False, and an ExceptionGroup of all errors if *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:718:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an
HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]

            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:725:

    def do_open(self, http_class, req,
**http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1147:

    def do_open(self, http_class, req,
**http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)

test/test_fuseki2__v3_8_0__stw.py:1107:

    def do_open(self, http_class, req,
**http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1114:

    def
do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)

test/test_fuseki2__v3_8_0__stw.py:1174:

    def do_open(self, http_class, req, **http_conn_args):
        """Return an
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1183: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1083:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1015:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1049:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an
HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1216:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:977: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:984:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req,
**http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '495', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1121:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '495', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req,
                     **http_conn_args):
        [... do_open source elided; identical to the listing above ...]
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open/http.client/create_connection traceback elided; identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSONLD_Conneg>

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1128:
[... SPARQLWrapper/Wrapper.py and urllib frames elided; identical to the first failure above ...]

    def
do_open(self, http_class, req, **http_conn_args):
        [... do_open source elided; identical to the listing above ...]
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '320', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open/http.client/create_connection traceback elided; identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected>

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:1191:
[... SPARQLWrapper/Wrapper.py and urllib frames elided; identical to the first failure above ...]

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse
object for the request, using http_class. ...
        [... do_open source elided; identical to the listing above ...]
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open/http.client/create_connection traceback elided; identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinJSON_Unexpected_Conneg>

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1200:
[... SPARQLWrapper/Wrapper.py and urllib frames elided; identical to the first failure above ...]

    def do_open(self, http_class, req,
                     **http_conn_args):
        [... do_open source elided; identical to the listing above ...]
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open/http.client/create_connection traceback elided; identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testDescribeByPOSTinN3_Conneg>

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1100:
[... SPARQLWrapper/Wrapper.py and urllib frames elided; identical to the first failure above ...]

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '283', 'Content-Type':
'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        [... do_open source elided; identical to the listing above ...]
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open source and http.client frames elided; identical to the first failure above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1032: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1066: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow(self): > result = self.__generic(describeQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1224: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1232: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:991: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283',
           'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:998: 
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_fuseki2__v3_8_0__stw.py:1257: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1242: 
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1248: 
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:1245: 
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_1(self): > result = self.__generic(queryWithCommaInCurie_1, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:1261: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1270:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:250: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse 
object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:257:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:306:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:390: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:313: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:348: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:278: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:285: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow(self): > result = self.__generic(selectQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:419: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return 
an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow_Conneg(self): > result = self.__generic(selectQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:428: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML(self): > result = self.__generic(selectQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:218: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:226: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:264: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse 
object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:271: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '387', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON(self): > result = self.__generic(selectQuery, JSON, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:320: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '387', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:411: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:369: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '537', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:292: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '537', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return 
an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:299: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow(self): > result = self.__generic(selectQuery, "bar", POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_fuseki2__v3_8_0__stw.py:437: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_fuseki2__v3_8_0__stw.py:194: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self = http_class = req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:446:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)

test/test_fuseki2__v3_8_0__stw.py:234:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:242:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:663:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:585:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:621: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:702: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:488: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:684: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:600: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:642: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow_Conneg(self): > result = self.__generic(askQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:721: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:505: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:898: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:864: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:936:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:834: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:774: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:804: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:972:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:744:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:917:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:879:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:955:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self = , http_class = , req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinN3_Conneg(self): > result = self.__generic(constructQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:849: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '573', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:789: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:819: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', 
...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow_Conneg(self): > result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:989: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:759:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1166:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1132:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1204:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1102:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:1042:

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1072: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} 
def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1012: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1185: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
                http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:1147:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:1223:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:1117:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:1057:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1087: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', 
 ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self = , http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinUnknow_Conneg(self): > result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1257: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinXML_Conneg(self): > result = self.__generic(describeQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1027: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = , http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'factforge.net',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_graphdbEnterprise__v8_9_0__rs.py:1286: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} 
    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_graphdbEnterprise__v8_9_0__rs.py:1268: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1271: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_1(self): > result = self.__generic(queryWithCommaInCurie_1, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1290: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:1298: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:269: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:407: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:329: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:365: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:299: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:445: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:236: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:284: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:428: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:344: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '677', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:386: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:314: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
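The chained traceback pattern above ("During handling of the above exception, another exception occurred") is urllib's normal failure path: the socket-level ConnectionRefusedError escapes `h.request(...)`, is caught as OSError in `do_open`, and is re-raised wrapped in URLError. A minimal sketch reproducing that chain, assuming nothing listens on a freshly released loopback port (the bind-then-close probe is an illustrative trick and has a small reuse race):

```python
import socket
import urllib.request
from urllib.error import URLError

# Pick a loopback port that is almost certainly closed: bind an
# ephemeral port to learn its number, then release it before connecting.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

# An empty ProxyHandler keeps environment proxy settings from
# intercepting the request, so the refusal comes from the socket itself.
opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))

caught = None
try:
    opener.open(f"http://127.0.0.1:{port}/", timeout=5)
except URLError as err:
    caught = err

# do_open() re-raises the low-level OSError as URLError, keeping the
# original ConnectionRefusedError reachable via .reason.
assert caught is not None
assert isinstance(caught.reason, ConnectionRefusedError)
```

This is why the failing tests report `urllib.error.URLError` at `request.py:1322` even though the root cause is the refused TCP connection on 127.0.0.1:9.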
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow_Conneg(self): > result = self.__generic(selectQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:464: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinXML_Conneg(self): > result = self.__generic(selectQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_graphdbEnterprise__v8_9_0__rs.py:253: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinCSV(self): > result = self.__generic(askQuery, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:536: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinCSV_Conneg(self): > result = self.__generic(askQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:543: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
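Two details of the `do_open` source quoted throughout these tracebacks are worth noting: header names are normalized with `str.title()` (which is why the dumped `headers` dicts show `Accept`, `Connection`, `Content-Type`), and `Proxy-Authorization` is moved onto the CONNECT tunnel rather than sent to the origin server. A standalone sketch of both steps, with illustrative header values:

```python
# Mirror of the normalization step in do_open(): title-case each name.
raw = {
    "connection": "close",
    "content-type": "application/x-www-form-urlencoded",
    "ACCEPT": "text/csv",
}
normalized = {name.title(): val for name, val in raw.items()}
assert normalized == {
    "Connection": "close",
    "Content-Type": "application/x-www-form-urlencoded",
    "Accept": "text/csv",
}

# Mirror of the tunnel handling: for tunneled (CONNECT) requests,
# Proxy-Authorization goes to the proxy only, never the origin server.
headers = dict(normalized)
headers["Proxy-Authorization"] = "Basic Zm9vOmJhcg=="  # dummy credentials
tunnel_headers = {}
if "Proxy-Authorization" in headers:
    tunnel_headers["Proxy-Authorization"] = headers.pop("Proxy-Authorization")
assert "Proxy-Authorization" not in headers
assert "Proxy-Authorization" in tunnel_headers
```

The `Connection: close` override seen in every dumped header dict comes from the same function: `addinfourl` cannot manage a persistent connection, so `do_open` forces the connection shut after the single request.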
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON(self): > result = self.__generic(askQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:604: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 
'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected(self): > result = self.__generic(askQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:687: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:697: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:611: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:641: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:651: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:570: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:577: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:731: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:740: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:498:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:506:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

self =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)

test/test_lov-fuseki_on_hold.py:967:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:976:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

self =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection':
'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:928:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection':
'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:936: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 
'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected(self): > result = self.__generic(constructQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1010: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1019: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3(self): > result = self.__generic(constructQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:890: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 
'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:898:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:814:

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:822:

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:852:

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:860:

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection':
'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1052:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1060:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:779:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:786:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected(self): > result = self.__generic(describeQuery, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1280: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 
2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1289: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD(self): > result = self.__generic(describeQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1244: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1251: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 
'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected(self): > result = self.__generic(describeQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1323: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 
2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1332: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host':
 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1207: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection':
 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1215: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept':
 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1131: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1139: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)

test/test_lov-fuseki_on_hold.py:1169: 
[identical traceback to testDescribeByGETinRDFXML_Conneg above:
 ConnectionRefusedError: [Errno 111] Connection refused, then
 urllib.error.URLError; Accept: 'application/turtle,text/turtle']
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1177: 
[identical traceback as above; Accept: 'application/turtle,text/turtle']
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_lov-fuseki_on_hold.py:1365: 
[identical traceback as above; Accept: 'application/rdf+xml']
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1373: 
[identical traceback as above; Accept: 'application/rdf+xml']
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:1096: 
[identical traceback as above; Accept: 'application/rdf+xml']
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1103:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_lov-fuseki_on_hold.py:1423:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1414:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1411:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:1443: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 
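[Editor's note] Every failure in this run has the same shape: the build's proxy points at 127.0.0.1 port 9, nothing listens there, `sock.connect` raises `ConnectionRefusedError`, and `do_open` re-raises the `OSError` wrapped in a `URLError`. A minimal sketch of that wrapping, assuming (as on the build hosts above) that nothing listens on local port 9:

```python
import urllib.error
import urllib.request

# Port 9 (the "discard" port) is assumed closed, as in this build environment.
# The refused TCP connect surfaces as an OSError, which urllib's do_open wraps
# in urllib.error.URLError -- the same exception pair as in the tracebacks.
err = None
try:
    urllib.request.urlopen("https://127.0.0.1:9/", timeout=2)
except urllib.error.URLError as e:
    err = e

# err.reason holds the underlying OSError (typically ConnectionRefusedError)
print(type(err.reason).__name__)
```

This is why the tests fail before any TLS or HTTP exchange happens: the connection is refused at the socket layer.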
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... same do_open source as above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close',
           'Host': 'lov.linkeddata.es',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... same do_open / http.client / socket.create_connection traceback as above ...]
>                   sock.connect(sa)
E                   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_lov-fuseki_on_hold.py:255: 
[... same SPARQLWrapper / urllib call chain as above ...]

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close',
           'Host': 'lov.linkeddata.es',
           'User-Agent': 'sparqlwrapper
2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... same do_open source as above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close',
           'Host': 'lov.linkeddata.es',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... same do_open / http.client / socket.create_connection traceback as above ...]
>                   sock.connect(sa)
E                   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:262: 
[... same SPARQLWrapper / urllib call chain as above ...]

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host':
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... same do_open source as above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'lov.linkeddata.es',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... same do_open / http.client / socket.create_connection traceback as above ...]
>                   sock.connect(sa)
E                   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_lov-fuseki_on_hold.py:323: 
[... same SPARQLWrapper / urllib call chain as above ...]

self = 
host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection':
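[Editor's note] The canonical header spellings in the `headers` dumps ('Connection', 'User-Agent', 'Host') come from the normalization step in the `do_open` source shown above: it forces `Connection: close` and then title-cases every header name. Those two lines in isolation, with illustrative values:

```python
# The normalization steps from urllib's do_open, on illustrative values:
headers = {"connection": "keep-alive", "user-agent": "sparqlwrapper 2.0.0"}
headers["Connection"] = "close"  # the forced override from do_open
headers = {name.title(): val for name, val in headers.items()}
print(headers)  # {'Connection': 'close', 'User-Agent': 'sparqlwrapper 2.0.0'}
```

Note that title-casing merges the lower-case `connection` key with the forced `Connection` key, and the later entry wins, so `close` always survives.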
'close', 'Host': 'lov.linkeddata.es',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... same do_open source as above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self = 
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... same do_open / http.client / socket.create_connection traceback as above ...]
>                   sock.connect(sa)
E                   ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_lov-fuseki_on_hold.py:406: 
[... same SPARQLWrapper / urllib call chain as above ...]

self = 
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close',
           'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper
2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... same do_open source as above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = 
host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... same do_open source as above ...]
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
[... same http.client call chain as above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  [... same docstring as
        above ...]  A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:416: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:330: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected(self): > result = self.__generic(selectQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:360: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_lov-fuseki_on_hold.py:370: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_lov-fuseki_on_hold.py:193: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:296:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:450:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:459:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:217:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es', 'User-Agent':
 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:225:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lov.linkeddata.es',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... urllib.request.do_open source repeated; see the first listing above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... urllib.request.do_open source repeated; see the first listing above ...]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... socket.create_connection source repeated; see the first listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:674:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... urllib.request.do_open source repeated; see the first listing above ...]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... urllib.request.do_open source repeated; see the first listing above ...]

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... socket.create_connection source repeated; see the first listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:591:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... urllib.request.do_open source repeated; see the first listing above ...]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... urllib.request.do_open source repeated; see the first listing above ...]

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... socket.create_connection source repeated; see the first listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:628:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... urllib.request.do_open source repeated; see the first listing above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        [... remainder of urllib.request.do_open source repeated; see the first listing above ...]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        [... socket.create_connection source repeated; see the first listing above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:716:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        [... urllib.request.do_open source repeated; see the first listing above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:494: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:697: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:606: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:651: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinUnknow_Conneg(self): > result = self.__generic(askQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:735: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        [... same do_open() source and urllib/http.client/socket frames as above ...]
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:511:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[... same urllib frames as above ...]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[... same do_open() source, h.request frames, and create_connection() docstring as above ...]
        and an ExceptionGroup of all errors if *all_errors* is True.
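As the `do_open()` frames above show, any `OSError` raised while sending the request is re-raised as `urllib.error.URLError` at urllib/request.py:1322, which is the exception pytest ultimately reports for each of these tests. A small sketch of that wrapping, again assuming nothing listens on 127.0.0.1:9 and no HTTP proxy is configured:

```python
import urllib.error
import urllib.request

# urlopen() drives the same do_open() path seen in the tracebacks: the
# refused TCP connection surfaces as an OSError inside http.client, and
# do_open() wraps it in a URLError. The original socket-level exception
# is preserved as the URLError's .reason attribute.
caught = None
try:
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=2)
except urllib.error.URLError as exc:
    caught = exc

print(type(caught).__name__, "reason:", type(caught.reason).__name__)
```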
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:912: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:878: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} 
def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:950: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:848: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:788: 
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:818: 
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:986: 
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:758: 
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:931: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:893:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:969:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:863:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:803:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:833: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow_Conneg(self): > result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1003: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML_Conneg(self): > result = self.__generic(constructQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:773: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1180: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1146: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} 
def do_open(self, http_class, req, **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.

    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')

    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)

    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})

    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?

    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}

    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

def do_open(self, http_class, req, **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.
    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')

    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)

    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})

    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?

    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}

    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host,
    port) for the socket to bind as a source address before making
    the connection.  A host of '' or port 0 tells the OS to use the default.

    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1218: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

def do_open(self, http_class, req, **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.

    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')

    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)

    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})

    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?

    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}

    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host,
    port) for the socket to bind as a source address before making
    the connection.  A host of '' or port 0 tells the OS to use the default.

    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3_Conneg(self): > result = self.__generic(describeQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1116: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err:  # timeout error
>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

def do_open(self, http_class, req, **http_conn_args):
    """Return an HTTPResponse object for the request, using http_class.

    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')

    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)

    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})

    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?

    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}

    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host,
    port) for the socket to bind as a source address before making
    the connection.  A host of '' or port 0 tells the OS to use the default.

    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinRDFXML_Conneg(self): > result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1056: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
    http_class must implement the HTTPConnection API from http.client.
    """
    host = req.host
    if not host:
        raise URLError('no host given')

    # will parse host:port
    h = http_class(host, timeout=req.timeout, **http_conn_args)
    h.set_debuglevel(self._debuglevel)

    headers = dict(req.unredirected_hdrs)
    headers.update({k: v for k, v in req.headers.items()
                    if k not in headers})

    # TODO(jhylton): Should this be redesigned to handle
    # persistent connections?

    # We want to make an HTTP/1.1 request, but the addinfourl
    # class isn't prepared to deal with a persistent connection.
    # It will try to read all remaining data from the socket,
    # which will block while the server waits for the next request.
    # So make sure the connection gets closed after the (only)
    # request.
    headers["Connection"] = "close"
    headers = {name.title(): val for name, val in headers.items()}

    if req._tunnel_host:
        tunnel_headers = {}
        proxy_auth_hdr = "Proxy-Authorization"
        if proxy_auth_hdr in headers:
            tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
            # Proxy-Authorization should not be sent to origin
            # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host,
    port) for the socket to bind as a source address before making
    the connection.  A host of '' or port 0 tells the OS to use the default.

    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinTURTLE_Conneg(self): > result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1086: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1254: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1025: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1199: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:1161: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def 
do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1237:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1131:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        ...
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1071:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1101:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        ...
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1271:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1041:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req,
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_rdf4j__geosciml.py:1305:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryBadFormed_1 ____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryBadFormed_1(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed_1, XML, GET)

test/test_rdf4j__geosciml.py:1282:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1289:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1309:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
                **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1317:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:266: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:409: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:326: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:363: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
    [... remainder of do_open source as above ...]

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to connect.
        If no *timeout* is supplied, the global default timeout setting returned
        by :func:`getdefaulttimeout` is used.  If *source_address* is set it must
        be a tuple of (host, port) for the socket to bind as a source address
        before making the connection.  A host of '' or port 0 tells the OS to use
        the default. When a connection cannot be created, raises the last error
        if *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:296: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow_Conneg(self): > result = self.__generic(selectQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:451: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:233: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:281: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:432: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:341: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 
'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:386: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:311: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, 
req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinUnknow_Conneg(self): > result = self.__generic(selectQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:470: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinXML_Conneg(self): > result = self.__generic(selectQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_rdf4j__geosciml.py:250: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_rdf4j__geosciml.py:200: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. [... do_open body identical to the listing above ...] except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class.
http_class must implement the HTTPConnection API from http.client. """ [... do_open body, http.client and socket.create_connection frames identical to the first failure above ...] E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) test/test_stardog__lindas.py:678: [... __generic, SPARQLWrapper/Wrapper.py and urllib/request.py frames identical to above, entering via https_open ...] def do_open(self, http_class, req, **http_conn_args): [... listing identical to above, through ...] # Proxy-Authorization should not be sent to origin # server.
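Every failure in this run bottoms out the same way: the test endpoints resolve to 127.0.0.1:9, where nothing is listening in the isolated build environment, so the TCP connect() is refused and urllib re-raises the OSError as URLError. A minimal sketch of that two-level chain, assuming (as here) that nothing listens on local port 9:

```python
import socket
import urllib.error
import urllib.request

# Socket level: the connect() itself is refused ([Errno 111] on Linux),
# exactly as socket.create_connection reports in the tracebacks above.
try:
    socket.create_connection(("127.0.0.1", 9), timeout=1)
except ConnectionRefusedError as err:
    print("socket level:", err)

# urllib level: do_open() catches the OSError and wraps it in URLError,
# which is the final exception each test in this log reports.
try:
    urllib.request.urlopen("http://127.0.0.1:9/", timeout=1)
except urllib.error.URLError as err:
    print("urllib level:", err.reason)
```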
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} [... do_open listing, http.client and socket.create_connection frames identical to the first failure above ...] E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) test/test_stardog__lindas.py:595: [... __generic, SPARQLWrapper/Wrapper.py and urllib/request.py frames identical to above, entering via https_open ...] def do_open(self, http_class, req, **http_conn_args): [... listing identical to above, through ...] # Proxy-Authorization should not be sent to origin # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} [... do_open listing, http.client and socket.create_connection frames identical to the first failure above ...] E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: def testAskByGETinN3_Unexpected_Conneg(self): > result = self.__generic(askQuery, N3, GET, onlyConneg=True) test/test_stardog__lindas.py:632: [... __generic, SPARQLWrapper/Wrapper.py and urllib/request.py frames identical to above, entering via https_open ...] def do_open(self, http_class, req, **http_conn_args): [... listing identical to above, through ...] # Proxy-Authorization should not be sent to origin # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} [... do_open listing, http.client and socket.create_connection frames identical to the first failure above ...] E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: def testAskByGETinUnknow_Conneg(self): > result = self.__generic(askQuery, "foo", GET, onlyConneg=True) test/test_stardog__lindas.py:720: [... __generic, SPARQLWrapper/Wrapper.py and urllib/request.py frames identical to above, entering via https_open ...] def do_open(self, http_class, req, **http_conn_args): [... listing identical to above, through ...] # Proxy-Authorization should not be sent to origin # server.
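All of these Stardog/lindas tests fail at the same point, `sparql.query()` in SPARQLWrapper/Wrapper.py, before any HTTP exchange happens. Since SPARQLWrapper delegates the HTTP call to urllib.request, an unreachable endpoint surfaces to callers as `urllib.error.URLError`; a hedged sketch of catching that failure mode at the call site (`run_query` is a hypothetical helper, not part of SPARQLWrapper):

```python
import urllib.error

def run_query(sparql):
    """Run sparql.query(), mapping an unreachable endpoint to None.

    Hypothetical wrapper: a refused connection arrives here as
    urllib.error.URLError, the exception seen throughout this log.
    """
    try:
        return sparql.query()
    except urllib.error.URLError as err:
        print(f"endpoint unreachable: {err.reason}")
        return None
```

With a query object whose endpoint refuses connections, `run_query` returns None instead of propagating the traceback.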
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:498:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_stardog__lindas.py:701:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '162',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_stardog__lindas.py:610:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Content-Length': '162',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:655:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162',
 'Content-Type':
 'application/x-www-form-urlencoded', ...}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)

test/test_stardog__lindas.py:739:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinXML_Conneg(self): > result = self.__generic(askQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:515: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': 
'162', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:916: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:882: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 
'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:954: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3_Conneg(self): > result = self.__generic(constructQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:852: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
'Connection': 'close', 'Host': 'lindas.admin.ch',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host':
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
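Every failure in this section is the same event at the bottom of the stack: `socket.create_connection()` cannot reach `127.0.0.1:9` (the discard port, to which the build environment redirects all outbound traffic), so after the address loop exhausts its candidates the first collected error, `ConnectionRefusedError`, is re-raised at `socket.py:864`. A minimal sketch of that failure mode; using a freshly released ephemeral port here is an assumption standing in for port 9, which may be filtered differently on other hosts:

```python
import socket

def reserved_closed_port() -> int:
    # Bind an ephemeral port and release it immediately; with no
    # listener left behind, connecting to it is refused right away.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]

def connect_error(port: int):
    try:
        # Same path the traceback shows: create_connection() -> connect()
        socket.create_connection(("127.0.0.1", port), timeout=2).close()
        return None
    except OSError as err:
        return err

err = connect_error(reserved_closed_port())
print(type(err).__name__)  # ConnectionRefusedError on a typical Linux host
```

Because nothing ever listens, the refusal is immediate rather than a timeout, which is why these tests fail fast instead of hanging.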
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:792:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host':
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:822:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host':
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:990:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host':
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host':
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
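The second half of each traceback explains why the tests ultimately report `urllib.error.URLError` rather than the raw socket error: `do_open()` catches the low-level `OSError` and re-raises it wrapped (`raise URLError(err)` at `request.py:1322`), keeping the original exception available as `.reason`. A sketch of that wrapping, again substituting a locally released ephemeral port for the unreachable endpoint; the empty `ProxyHandler` is an assumption added so that any `http_proxy` environment variable cannot divert the request:

```python
import socket
import urllib.error
import urllib.request

def reserved_closed_port() -> int:
    # Reserve and release an ephemeral port so nothing is listening on it.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))
        return s.getsockname()[1]

# Ignore proxy environment variables so the request really hits localhost.
opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))

try:
    opener.open(f"http://127.0.0.1:{reserved_closed_port()}/", timeout=2)
except urllib.error.URLError as err:
    wrapped = err

# The socket-level error survives as the .reason attribute.
print(type(wrapped).__name__, "->", type(wrapped.reason).__name__)
```

This is why SPARQLWrapper's `sparql.query()` surfaces a `URLError` even though the root cause is a refused TCP connection two layers down.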
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:762:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host':
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:935:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342',
'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:897: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 
'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:973: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': 
'342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinN3_Conneg(self): > result = self.__generic(constructQuery, N3, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:867: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinRDFXML_Conneg(self): > result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:807: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:837: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 
'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinUnknow_Conneg(self): > result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1007: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByPOSTinXML_Conneg(self): > result = self.__generic(constructQuery, XML, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:777: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1183: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1149: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection':
 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1221: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1119: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1059: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if
    *all_errors* is False, and an ExceptionGroup of all errors if
    *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1089: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = , http_class = 
req = , http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'lindas.admin.ch',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow_Conneg(self): > result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1257: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1029: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1202: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': 
'149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1164: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 
'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByPOSTinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:1240: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': 
'149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1134: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1074: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1104: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 
'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1274: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 
'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1044: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': 
'149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_stardog__lindas.py:1307:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close',
'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1298:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1293:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function.  Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object.  Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.  If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used.  If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection cannot be created, raises the last error if *all_errors*
    is False, and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_stardog__lindas.py:270:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_stardog__lindas.py:413:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
           'Connection': 'close', 'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_stardog__lindas.py:330:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:367:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
           'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_stardog__lindas.py:300:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

>           raise URLError(err)
E           urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
           'Host': 'lindas.admin.ch',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinUnknow_Conneg(self): > result = self.__generic(selectQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:455: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:237: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:285: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': 
'386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_stardog__lindas.py:436: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_stardog__lindas.py:204: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': 
'386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:345:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:390:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:315:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:474:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:254:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:520:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using
http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinCSV_Conneg(self): > result = self.__generic(askQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:527: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an 
HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON(self): > result = self.__generic(askQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:583: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:673: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:590: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)

test/test_store__v1_1_4.py:627:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:560:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)

test/test_store__v1_1_4.py:718:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:494:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:942:
E   urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:981: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:872: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:797: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:834: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host,
        port) for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1020: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self =  http_class =  req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect.  If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used.  If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:763:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:1246: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSON_Unexpected_Conneg(self): > result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:1287: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect.  If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used.  If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1097:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host, port)``)
        and return the socket object.  Passing the optional *timeout* parameter
        will set the timeout on the socket instance before attempting to
        connect.  If no *timeout* is supplied, the global default timeout
        setting returned by :func:`getdefaulttimeout` is used.  If
        *source_address* is set it must be a tuple of (host, port) for the
        socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a
        connection cannot be created, raises the last error if *all_errors*
        is False, and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1326:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1062:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_store__v1_1_4.py:1371:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_store__v1_1_4.py:1356:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1362:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
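Every failure above shares one root cause: each test tries to reach a SPARQL endpoint at 127.0.0.1:9, where nothing is listening, so `socket.connect()` is refused and `do_open()` wraps the resulting `OSError` in a `URLError`. The chain can be reproduced in isolation with a minimal stdlib-only sketch; the `/sparql` path is illustrative, and an ephemeral port is used instead of port 9 so the snippet does not depend on any local service:

```python
import socket
import urllib.error
import urllib.request

# Reserve a local port with no listener: bind an ephemeral port, note its
# number, and close it again. Connecting to it afterwards is refused, just
# like connecting to the unreachable endpoint 127.0.0.1:9 in the log above.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
unused_port = probe.getsockname()[1]
probe.close()

try:
    urllib.request.urlopen(f"http://127.0.0.1:{unused_port}/sparql", timeout=5)
except urllib.error.URLError as exc:
    # do_open() re-raises the OSError wrapped in URLError; the original
    # ConnectionRefusedError is preserved as the .reason attribute.
    print(type(exc.reason).__name__)  # prints "ConnectionRefusedError"
```

The same wrapping is why pytest reports two chained exceptions per test: the inner `ConnectionRefusedError` from `socket.create_connection()` and the outer `URLError` from `urllib.request`.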
    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:1359: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return 
an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:1387: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:247: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object 
for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:254: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON(self): > result = self.__generic(selectQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:310: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:403: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, 
**http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON_Conneg(self): > result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:317: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 
(rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinN3_Unexpected_Conneg(self): > result = self.__generic(selectQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:357: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): 
"""Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. 
""" host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinTSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_store__v1_1_4.py:287: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_store__v1_1_4.py:188: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1348: in http_open return self.do_open(http.client.HTTPConnection, req) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, 
http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ self = http_class = req = , http_conn_args = {} host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. 
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:448:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class,
                req, **http_conn_args):
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:221:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req,
                **http_conn_args):
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:526:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0
 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:533:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host':
 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:586:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError

_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection

address = ('127.0.0.1', 9), timeout = source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:593:
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:556:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:563:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:728:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:737:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection':
'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML(self): > result = self.__generic(askQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:492: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByGETinXML_Conneg(self): > result = self.__generic(askQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:500: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 
'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '228', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testAskByPOSTinJSON_Conneg(self): > result = self.__generic(askQuery, JSON, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:608: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '228', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinCSV_Unexpected_Conneg(self): > result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:950: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 
'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:895: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:904: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:866: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:873: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:802: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.
    
        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """
    
        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:809: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.
    
        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')
    
        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)
    
        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})
    
        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?
    
        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}
    
        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE(self): > result = self.__generic(constructQuery, TURTLE, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:833: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 
'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinTURTLE_Conneg(self): > result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:841: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow(self): > result = self.__generic(constructQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:1048: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinUnknow_Conneg(self): > result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:1056: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 
'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinXML __________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:772: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:779: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection':
 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '408',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:977: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection':
 'close', 'Content-Length': '408',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Content-Length': '439',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:880: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept':
 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Content-Length': '439',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '408',
 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1073: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml',
 'Connection': 'close', 'Content-Length': '408',
 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self =  http_class = 
req =  http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1266: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1211: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1220: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1181: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1188:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1117:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1124:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml',
'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1148:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host':
'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
address = ('127.0.0.1', 9), timeout =
source_address = None
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1156:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinUnknow(self): > result = self.__generic(describeQuery, "foo", GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:1364: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
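The failure mode above — `socket.create_connection` raising `ConnectionRefusedError` because nothing listens on the target port — can be reproduced with the standard library alone. This is an illustrative sketch, not part of the test suite: the log's tests point at `127.0.0.1:9`, while the sketch uses a just-released ephemeral port so it does not assume the discard service is absent.

```python
import socket

# Reserve an ephemeral loopback port, then free it so no one is listening.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

try:
    # Same call the traceback shows failing inside http.client's connect().
    socket.create_connection(("127.0.0.1", port), timeout=1)
    raised = None
except OSError as err:  # ConnectionRefusedError is an OSError subclass
    raised = err

print(type(raised).__name__)  # ConnectionRefusedError on Linux (errno 111)
```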
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open source and http.client/socket call chain identical to the
     first failure above, ending in ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1372:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib call chain through https_open identical to the first failure above ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open source context identical to the first listing above ...]
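Each failure ends with `urllib.error.URLError` because `do_open` (shown in the first listing) wraps the underlying `OSError` and re-raises it; the original exception survives as the `.reason` attribute. A minimal stdlib sketch of that wrapping, again using a freshly-freed loopback port rather than the log's `127.0.0.1:9`:

```python
import socket
import urllib.error
import urllib.request

# Find a loopback port with no listener.
s = socket.socket()
s.bind(("127.0.0.1", 0))
port = s.getsockname()[1]
s.close()

try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=1)
except urllib.error.URLError as err:
    # do_open caught the OSError from the socket layer and raised
    # URLError(err); the refused connection is still available here.
    print(type(err.reason).__name__)  # ConnectionRefusedError on Linux
```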
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open source and http.client/socket call chain identical to the
     first failure above, ending in ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1087:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib call chain through https_open identical to the first failure above ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open source context identical to the first listing above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open source and http.client/socket call chain identical to the
     first failure above, ending in ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1094:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib call chain through https_open identical to the first failure above ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open source context identical to the first listing above ...]
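A side note on the `headers` locals shown in these tracebacks: the names are title-cased (`Connection`, `User-Agent`) because `do_open` normalizes them with `name.title()` before sending, as the first listing shows. A tiny sketch of that comprehension in isolation:

```python
# The normalization step from do_open, applied to lower-cased header names.
headers = {"connection": "close", "user-agent": "sparqlwrapper 2.0.0"}
headers = {name.title(): val for name, val in headers.items()}
print(headers)  # {'Connection': 'close', 'User-Agent': 'Sparqlwrapper 2.0.0'.. no: values untouched
```

Note that `str.title()` capitalizes each alphabetic run, so `"user-agent"` becomes `"User-Agent"`; only the keys are rewritten, values pass through unchanged.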
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open source and http.client/socket call chain identical to the
     first failure above, ending in ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()
>       sparql.query()

test/test_virtuoso__v7_20_3230__dbpedia.py:1416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    [... urllib call chain through https_open identical to the first failure above ...]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open source context identical to the first listing above, ending in ...]
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
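Every failure in this run is the same offline symptom: the suite's live-endpoint tests point at `127.0.0.1:9` inside a network-isolated build, and each connect is refused. A stdlib-only probe like the following sketch is one way a suite could detect this and skip rather than error; the helper name is illustrative and not part of SPARQLWrapper's test suite.

```python
import socket

def endpoint_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to host:port succeeds.

    Illustrative helper, not from the test suite under build.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# In a network-isolated build the probe fails, so the suite could gate
# live tests with e.g. unittest.skipUnless(endpoint_reachable(...), "offline").
print(endpoint_reachable("127.0.0.1", 9))
```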
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_virtuoso__v7_20_3230__dbpedia.py:1401: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryDuplicatedPrefix(self): > result = self.__generic(queryDuplicatedPrefix, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:1407: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryManyPrefixes(self): > result = self.__generic(queryManyPrefixes, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:1404: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryWithComma_3(self): > result = self.__generic(queryWithCommaInUri, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:1428: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 
'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:248:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:255:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
__
test/test_virtuoso__v7_20_3230__dbpedia.py:308:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:406:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = , source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:315:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:278:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:285:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:448:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:457:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
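The failure pattern repeated throughout these tracebacks can be reproduced in isolation: every request is routed to a local port with no listener, `socket.connect` fails with `ConnectionRefusedError` (an `OSError`), and `do_open` re-raises it wrapped in `URLError`, which is what pytest reports for each test. A minimal sketch, assuming a freshly probed unused local port stands in for the build environment's dead proxy address:

```python
import socket
import urllib.error
import urllib.request

# Find a local TCP port with no listener: bind to an ephemeral port,
# record the number, then close the socket so nothing is listening there.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
dead_port = probe.getsockname()[1]
probe.close()

# do_open() catches the OSError raised by the connect attempt and
# re-raises it as URLError, exactly as in the tracebacks above.
try:
    urllib.request.urlopen(f"http://127.0.0.1:{dead_port}/", timeout=5)
except urllib.error.URLError as err:
    # The original OSError survives as the .reason attribute.
    print(type(err.reason).__name__)
```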
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML(self): > result = self.__generic(selectQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:214: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 
'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '349', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected(self): > result = self.__generic(selectQuery, JSONLD, POST) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:428: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '349', 
'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '278', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v7_20_3230__dbpedia.py:439: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 
'close', 'Content-Length': '278', 'Content-Type': 'application/x-www-form-urlencoded', ...} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:528: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host':
'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host':
'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:535: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host':
'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept':
'application/sparql-results+json,application/json,text/javascript,application/javascript',
'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:588: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept':
'application/sparql-results+json,application/json,text/javascript,application/javascript',
'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept':
'application/sparql-results+json,application/json,text/javascript,application/javascript',
'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:595: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept':
'application/sparql-results+json,application/json,text/javascript,application/javascript',
'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:558: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
'Host': 'live.dbpedia.org', 'User-Agent':
'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the
        connection.  A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:565: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close',
'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0
(rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"

        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:731:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:740:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:494:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:502:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:954:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection':
'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD(self): > result = self.__generic(constructQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:899: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 
'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinJSONLD_Conneg(self): > result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:908: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testConstructByGETinN3(self): > result = self.__generic(constructQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:869: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:876:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:805:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:812:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:836:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:844:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1053:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                    del headers[proxy_auth_hdr]
                h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
test/test_virtuoso__v8_03_3313__dbpedia.py:1061: 
E               ConnectionRefusedError: [Errno 111] Connection refused
E               urllib.error.URLError: 
/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)
test/test_virtuoso__v8_03_3313__dbpedia.py:775: 
E               ConnectionRefusedError: [Errno 111] Connection refused
E               urllib.error.URLError: 
/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
test/test_virtuoso__v8_03_3313__dbpedia.py:782: 
E               ConnectionRefusedError: [Errno 111] Connection refused
E               urllib.error.URLError: 
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
test/test_virtuoso__v8_03_3313__dbpedia.py:1272: 
E               ConnectionRefusedError: [Errno 111] Connection refused
E               urllib.error.URLError: 
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD(self): > result = self.__generic(describeQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:1217: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 
'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinJSONLD_Conneg(self): > result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:1226: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3(self): > result = self.__generic(describeQuery, N3, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:1187: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinN3_Conneg(self): > result = self.__generic(describeQuery, N3, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:1194: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1123:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1130:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

self =
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1154:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self =
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1162:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self =
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:1370:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
/usr/lib/python3.13/urllib/request.py:1322: in do_open
    raise URLError(err)
E   urllib.error.URLError:
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self =
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

/usr/lib/python3.13/urllib/request.py:1319: in do_open
    h.request(req.get_method(), req.selector, req.data, headers,
              encode_chunked=req.has_header('Transfer-encoding'))
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:849: in create_connection
    sock.connect(sa)
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:1378:
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close',
           'Host': 'live.dbpedia.org',
           'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError __________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML(self): > result = self.__generic(describeQuery, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:1093: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testDescribeByGETinXML_Conneg(self): > result = self.__generic(describeQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:1100: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/rdf+xml', 'Connection': 
'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________________ SPARQLWrapperTests.testKeepAlive _______________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testKeepAlive(self): sparql = SPARQLWrapper(endpoint) sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") sparql.setReturnFormat(JSON) sparql.setMethod(GET) sparql.setUseKeepAlive() > sparql.query() test/test_virtuoso__v8_03_3313__dbpedia.py:1422: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 
'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________________ SPARQLWrapperTests.testQueryBadFormed _____________________ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryBadFormed(self): > self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) test/test_virtuoso__v8_03_3313__dbpedia.py:1407: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. 
""" host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testQueryDuplicatedPrefix(self): > result = self.__generic(queryDuplicatedPrefix, XML, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:1413: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 
 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1410:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1426:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1433:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:248:

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinCSV_Conneg(self): > result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:255: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'text/csv', 'Connection': 
'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSON(self): > result = self.__generic(selectQuery, JSON, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:308: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 
'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected(self): > result = self.__generic(selectQuery, JSONLD, GET) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:406: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 
'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinJSONLD_Unexpected_Conneg(self): > result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:417: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': '*/*', 'Connection': 
'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.

        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:315:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript',
 'Connection': 'close', 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
            del headers[proxy_auth_hdr]
        h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
    try:
        try:
            h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))
    except OSError as err:  # timeout error
>       raise URLError(err)
E       urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:278:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError:
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:285:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError:
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:450:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError:
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:459:
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError:
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:214:
E   ConnectionRefusedError: [Errno 111] Connection refused

self = http_class = req = http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close',
 'Host': 'live.dbpedia.org',
 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError ________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: > h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) /usr/lib/python3.13/urllib/request.py:1319: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /usr/lib/python3.13/http/client.py:1338: in request self._send_request(method, url, body, headers, encode_chunked) /usr/lib/python3.13/http/client.py:1384: in _send_request self.endheaders(body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1333: in endheaders self._send_output(message_body, encode_chunked=encode_chunked) /usr/lib/python3.13/http/client.py:1093: in _send_output self.send(msg) /usr/lib/python3.13/http/client.py:1037: in send self.connect() /usr/lib/python3.13/http/client.py:1472: in connect super().connect() /usr/lib/python3.13/http/client.py:1003: in connect self.sock = self._create_connection( /usr/lib/python3.13/socket.py:864: in create_connection raise exceptions[0] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ address = ('127.0.0.1', 9), timeout = source_address = None def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, source_address=None, *, all_errors=False): """Connect to *address* and return the socket object. Convenience function. Connect to *address* (a 2-tuple ``(host, port)``) and return the socket object. Passing the optional *timeout* parameter will set the timeout on the socket instance before attempting to connect. If no *timeout* is supplied, the global default timeout setting returned by :func:`getdefaulttimeout` is used. If *source_address* is set it must be a tuple of (host, port) for the socket to bind as a source address before making the connection. A host of '' or port 0 tells the OS to use the default. 
When a connection cannot be created, raises the last error if *all_errors* is False, and an ExceptionGroup of all errors if *all_errors* is True. """ host, port = address exceptions = [] for res in getaddrinfo(host, port, 0, SOCK_STREAM): af, socktype, proto, canonname, sa = res sock = None try: sock = socket(af, socktype, proto) if timeout is not _GLOBAL_DEFAULT_TIMEOUT: sock.settimeout(timeout) if source_address: sock.bind(source_address) > sock.connect(sa) E ConnectionRefusedError: [Errno 111] Connection refused /usr/lib/python3.13/socket.py:849: ConnectionRefusedError During handling of the above exception, another exception occurred: self = def testSelectByGETinXML_Conneg(self): > result = self.__generic(selectQuery, XML, GET, onlyConneg=True) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ test/test_virtuoso__v8_03_3313__dbpedia.py:222: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic result = sparql.query() ^^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:960: in query return QueryResult(self._query()) ^^^^^^^^^^^^^ SPARQLWrapper/Wrapper.py:926: in _query response = urlopener(request) ^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:189: in urlopen return opener.open(url, data, timeout) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:489: in open response = self._open(req, data) ^^^^^^^^^^^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:506: in _open result = self._call_chain(self.handle_open, protocol, protocol + /usr/lib/python3.13/urllib/request.py:466: in _call_chain result = func(*args) ^^^^^^^^^^^ /usr/lib/python3.13/urllib/request.py:1367: in https_open return self.do_open(http.client.HTTPSConnection, req, _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = http_class = req = http_conn_args = {'context': } host = '127.0.0.1:9', h = headers = {'Accept': 'application/sparql-results+xml', 
'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} def do_open(self, http_class, req, **http_conn_args): """Return an HTTPResponse object for the request, using http_class. http_class must implement the HTTPConnection API from http.client. """ host = req.host if not host: raise URLError('no host given') # will parse host:port h = http_class(host, timeout=req.timeout, **http_conn_args) h.set_debuglevel(self._debuglevel) headers = dict(req.unredirected_hdrs) headers.update({k: v for k, v in req.headers.items() if k not in headers}) # TODO(jhylton): Should this be redesigned to handle # persistent connections? # We want to make an HTTP/1.1 request, but the addinfourl # class isn't prepared to deal with a persistent connection. # It will try to read all remaining data from the socket, # which will block while the server waits for the next request. # So make sure the connection gets closed after the (only) # request. headers["Connection"] = "close" headers = {name.title(): val for name, val in headers.items()} if req._tunnel_host: tunnel_headers = {} proxy_auth_hdr = "Proxy-Authorization" if proxy_auth_hdr in headers: tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] # Proxy-Authorization should not be sent to origin # server. 
del headers[proxy_auth_hdr] h.set_tunnel(req._tunnel_host, headers=tunnel_headers) try: try: h.request(req.get_method(), req.selector, req.data, headers, encode_chunked=req.has_header('Transfer-encoding')) except OSError as err: # timeout error > raise URLError(err) E urllib.error.URLError: /usr/lib/python3.13/urllib/request.py:1322: URLError _________________________ QueryResult_Test.testConvert _________________________ self = def testConvert(self): class FakeResponse(object): def __init__(self, content_type): self.content_type = content_type def info(self): return {"content-type": self.content_type} def read(self, len): return "" def _mime_vs_type(mime, requested_type): """ :param mime: mimetype/Content-Type of the response :param requested_type: requested mimetype (alias) :return: number of warnings produced by combo """ with warnings.catch_warnings(record=True) as w: qr = QueryResult((FakeResponse(mime), requested_type)) try: qr.convert() except: pass # if len(w) > 0: print(w[0].message) # FOR DEBUG # if len(w) > 1: print(w[1].message) # FOR DEBUG return len(w) # In the cases of "application/ld+json" and "application/rdf+xml", the # RDFLib raised a warning because the manually created QueryResult has no real # response value (implemented a fake read). # "WARNING:rdflib.term: does not look like a valid URI, trying to serialize this will break." 
        self.assertEqual(0, _mime_vs_type("application/sparql-results+xml", XML))
        self.assertEqual(0, _mime_vs_type("application/sparql-results+json", JSON))
        self.assertEqual(0, _mime_vs_type("text/n3", N3))
        self.assertEqual(0, _mime_vs_type("text/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/json", JSON))
>       self.assertEqual(0, _mime_vs_type("application/ld+json", JSONLD))
E       AssertionError: 0 != 1

test/test_wrapper.py:876: AssertionError
=============================== warnings summary ===============================
test/test_agrovoc-allegrograph_on_hold.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_agrovoc-allegrograph_on_hold.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_allegrograph__v4_14_1__mmi.py:166
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_allegrograph__v4_14_1__mmi.py:166: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_blazegraph__wikidata.py:175
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_blazegraph__wikidata.py:175: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_fuseki2__v3_6_0__agrovoc.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_fuseki2__v3_6_0__agrovoc.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_fuseki2__v3_8_0__stw.py:168
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_fuseki2__v3_8_0__stw.py:168: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_graphdbEnterprise__v8_9_0__rs.py:179
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_graphdbEnterprise__v8_9_0__rs.py:179: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_lov-fuseki_on_hold.py:170
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_lov-fuseki_on_hold.py:170: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_rdf4j__geosciml.py:176
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_rdf4j__geosciml.py:176: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_stardog__lindas.py:180
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_stardog__lindas.py:180: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_store__v1_1_4.py:165
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_store__v1_1_4.py:165: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_virtuoso__v7_20_3230__dbpedia.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_virtuoso__v7_20_3230__dbpedia.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_virtuoso__v8_03_3313__dbpedia.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_virtuoso__v8_03_3313__dbpedia.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'ASK' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'ASK' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
test/test_blazegraph__wikidata.py: 8 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
test/test_fuseki2__v3_8_0__stw.py: 8 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
test/test_lov-fuseki_on_hold.py: 8 warnings
test/test_rdf4j__geosciml.py: 4 warnings
test/test_stardog__lindas.py: 4 warnings
test/test_store__v1_1_4.py: 8 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 8 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 8 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'foo'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
test/test_blazegraph__wikidata.py: 8 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
test/test_fuseki2__v3_8_0__stw.py: 8 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
test/test_rdf4j__geosciml.py: 4 warnings
test/test_stardog__lindas.py: 4 warnings
test/test_store__v1_1_4.py: 8 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'bar'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 2 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'CONSTRUCT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 2 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings
test/test_fuseki2__v3_8_0__stw.py: 4 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'CONSTRUCT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'DESCRIBE' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 2 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings
test/test_fuseki2__v3_8_0__stw.py: 4 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'DESCRIBE' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 1 warning
test/test_allegrograph__v4_14_1__mmi.py: 1 warning
test/test_blazegraph__wikidata.py: 1 warning
test/test_fuseki2__v3_6_0__agrovoc.py: 1 warning
test/test_fuseki2__v3_8_0__stw.py: 1 warning
test/test_graphdbEnterprise__v8_9_0__rs.py: 1 warning
test/test_lov-fuseki_on_hold.py: 1 warning
test/test_rdf4j__geosciml.py: 1 warning
test/test_stardog__lindas.py: 1 warning
test/test_store__v1_1_4.py: 1 warning
test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:663: UserWarning: keepalive support not available, so the execution of this method has no effect
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
test/test_virtuoso__v7_20_3230__dbpedia.py: 4 warnings
test/test_virtuoso__v8_03_3313__dbpedia.py: 2 warnings
test/test_wrapper.py: 1 warning
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'SELECT' SPARQL query form
    warnings.warn(

test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
test/test_blazegraph__wikidata.py: 4 warnings
test/test_cli.py: 1 warning
test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
test/test_fuseki2__v3_8_0__stw.py: 2 warnings
test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
test/test_lov-fuseki_on_hold.py: 2 warnings
test/test_rdf4j__geosciml.py: 2 warnings
test/test_stardog__lindas.py: 2 warnings
test/test_store__v1_1_4.py: 3 warnings
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'SELECT' SPARQL query form
    warnings.warn(

test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'rdf' in a 'DESCRIBE' SPARQL query form
    warnings.warn(
test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'rdf+xml' in a 'SELECT' SPARQL query form
    warnings.warn(

test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtle
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'turtle' in a 'SELECT' SPARQL query form
    warnings.warn(

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
=========================== short test summary info ============================
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSON
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSONLD
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinN3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinN3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinCSV
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinJSON
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinTSV
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED
test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV FAILED 
test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED 
test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED 
test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED 
test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg 
FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_cli.py::SPARQLWrapperCLIParser_Test::testInvalidFormat - Ass... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF - urllib.error.U... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryTo4store - urllib.er... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAgrovoc_AllegroGraph FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAllegroGraph - url... 
FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToBrazeGraph - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_6 - urll... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_8 - urll... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToGraphDBEnterprise FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToLovFuseki - urllib... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToRDF4J - urllib.err... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToStardog - urllib.e... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV7 - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV8 - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithEndpoint - urlli... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFile - urllib.er... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileCSV - urllib... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileN3 - urllib.... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML - url... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTSV - urllib... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtle - url... FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtleQuiet FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileXML - urllib... 
FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED 
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg FAILED 
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg FAILED 
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED 
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV FAILED 
test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML FAILED 
test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML
FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testKeepAlive - u...
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testKeepAlive - urll...
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryBadFormed_1
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testKeepAlive - urll...
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_1
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV - ur...
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON - u...
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testKeepAlive - urllib...
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryBadFormed - u...
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinN3
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testKeepAlive
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected
FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML
FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
FAILED
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML FAILED 
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testKeepAlive FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_1 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3 FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV FAILED 
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg FAILED test/test_wrapper.py::QueryResult_Test::testConvert - AssertionError: ... = 858 failed, 38 passed, 549 skipped, 80 xfailed, 381 warnings in 760.92s (0:12:40) = E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build; python3.13 -m pytest test dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.13 returned exit code 13 make[1]: Leaving directory '/build/reproducible-path/sparql-wrapper-python-2.0.0' create-stamp debian/debhelper-build-stamp dh_testroot -O--buildsystem=pybuild dh_prep -O--buildsystem=pybuild dh_auto_install --destdir=debian/python3-sparqlwrapper/ -O--buildsystem=pybuild I: pybuild pybuild:308: rm -fr /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test I: pybuild base:311: /usr/bin/python3 setup.py install --root /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper /usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated. !! ******************************************************************************** Please consider removing the following classifiers in favor of a SPDX license expression: License :: OSI Approved :: W3C License See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details. 
********************************************************************************
!!
self._finalize_license_expression()
running install
/usr/lib/python3/dist-packages/setuptools/_distutils/cmd.py:90: SetuptoolsDeprecationWarning: setup.py install is deprecated.
!!
********************************************************************************
Please avoid running ``setup.py`` directly. Instead, use pypa/build, pypa/installer or other standards-based tools.
See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details.
********************************************************************************
!!
self.initialize_options()
running build
running build_py
running install_lib
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/v/cache/lastfailed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/v/cache/nodeids -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/CACHEDIR.TAG -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/.gitignore -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/README.md -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/main.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/sparql_dataframe.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/SPARQLExceptions.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/KeyCaseInsensitiveDict.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/Wrapper.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/SmartWrapper.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/__init__.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/KeyCaseInsensitiveDict.py to KeyCaseInsensitiveDict.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/SPARQLExceptions.py to SPARQLExceptions.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/SmartWrapper.py to SmartWrapper.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/Wrapper.py to Wrapper.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__init__.py to __init__.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/main.py to main.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/sparql_dataframe.py to sparql_dataframe.cpython-313.pyc
running install_egg_info
running egg_info
creating SPARQLWrapper.egg-info
writing SPARQLWrapper.egg-info/PKG-INFO
writing dependency_links to SPARQLWrapper.egg-info/dependency_links.txt
writing entry points to SPARQLWrapper.egg-info/entry_points.txt
writing requirements to SPARQLWrapper.egg-info/requires.txt
writing top-level names to SPARQLWrapper.egg-info/top_level.txt
writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
reading manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files found matching 'Makefile'
warning: no directories found matching 'docs/build/html'
adding license file 'LICENSE.txt'
adding license file 'AUTHORS.md'
writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
Copying SPARQLWrapper.egg-info to /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper-2.0.0.egg-info
Skipping SOURCES.txt
running install_scripts
Installing rqw script to /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/bin
   dh_installdocs -O--buildsystem=pybuild
   dh_installchangelogs -O--buildsystem=pybuild
   dh_installexamples -O--buildsystem=pybuild
   dh_python3 -O--buildsystem=pybuild
I: dh_python3 tools:114: replacing shebang in debian/python3-sparqlwrapper/usr/bin/rqw
   dh_installsystemduser -O--buildsystem=pybuild
   dh_perl -O--buildsystem=pybuild
   dh_link -O--buildsystem=pybuild
   dh_strip_nondeterminism -O--buildsystem=pybuild
   dh_compress -O--buildsystem=pybuild
   dh_fixperms -O--buildsystem=pybuild
   dh_missing -O--buildsystem=pybuild
   dh_installdeb -O--buildsystem=pybuild
   dh_gencontrol -O--buildsystem=pybuild
   dh_md5sums -O--buildsystem=pybuild
   dh_builddeb -O--buildsystem=pybuild
dpkg-deb: building package 'python3-sparqlwrapper' in '../python3-sparqlwrapper_2.0.0-2_all.deb'.
 dpkg-genbuildinfo --build=binary -O../sparql-wrapper-python_2.0.0-2_arm64.buildinfo
 dpkg-genchanges --build=binary -O../sparql-wrapper-python_2.0.0-2_arm64.changes
dpkg-genchanges: info: binary-only upload (no source code included)
 dpkg-source --after-build .
dpkg-buildpackage: info: binary-only upload (no source included)
dpkg-genchanges: info: not including original source code in upload
I: copying local configuration
I: unmounting dev/ptmx filesystem
I: unmounting dev/pts filesystem
I: unmounting dev/shm filesystem
I: unmounting proc filesystem
I: unmounting sys filesystem
I: cleaning the build env
I: removing directory /srv/workspace/pbuilder/4132079 and its subdirectories
I: Current time: Fri Oct 31 10:01:44 -12 2025
I: pbuilder-time-stamp: 1761948104
Fri Oct 31 22:01:46 UTC 2025 I: 1st build successful. Starting 2nd build on remote node codethink03-arm64.debian.net.
Fri Oct 31 22:01:46 UTC 2025 I: Preparing to do remote build '2' on codethink03-arm64.debian.net.
Fri Oct 31 22:01:46 UTC 2025 - checking /var/lib/jenkins/offline_nodes if codethink03-arm64.debian.net is marked as down.
Fri Oct 31 22:01:46 UTC 2025 - checking via ssh if codethink03-arm64.debian.net is up.
removed '/tmp/read-only-fs-test-sGnmql'
====================================================================================
Fri Dec 4 04:24:47 UTC 2026 - running /srv/jenkins/bin/reproducible_build.sh (for job /srv/jenkins/bin/reproducible_build.sh) on codethink03-arm64, called using "2 sparql-wrapper-python forky /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy 2.0.0-2" as arguments.
Fri Dec 4 04:24:47 UTC 2026 - actually running "reproducible_build.sh" (md5sum bcb6fe1b50cf4e8eedacd0971a9eb63f) as "/tmp/jenkins-script-CqsiKTZG"
$ git clone https://salsa.debian.org/qa/jenkins.debian.net.git ; more CONTRIBUTING
Fri Dec 4 04:24:47 UTC 2026 I: Downloading source for forky/sparql-wrapper-python=2.0.0-2
Reading package lists...
NOTICE: 'sparql-wrapper-python' packaging is maintained in the 'Git' version control system at:
https://salsa.debian.org/python-team/packages/sparql-wrapper-python.git
Please use:
git clone https://salsa.debian.org/python-team/packages/sparql-wrapper-python.git
to retrieve the latest (possibly unreleased) updates to the package.
Need to get 140 kB of source archives.
Get:1 http://deb.debian.org/debian forky/main sparql-wrapper-python 2.0.0-2 (dsc) [2214 B]
Get:2 http://deb.debian.org/debian forky/main sparql-wrapper-python 2.0.0-2 (tar) [132 kB]
Get:3 http://deb.debian.org/debian forky/main sparql-wrapper-python 2.0.0-2 (diff) [5692 B]
Fetched 140 kB in 0s (0 B/s)
Download complete and in download only mode
Reading package lists...
NOTICE: 'sparql-wrapper-python' packaging is maintained in the 'Git' version control system at:
https://salsa.debian.org/python-team/packages/sparql-wrapper-python.git
Please use:
git clone https://salsa.debian.org/python-team/packages/sparql-wrapper-python.git
to retrieve the latest (possibly unreleased) updates to the package.
Need to get 140 kB of source archives.
Get:1 http://deb.debian.org/debian forky/main sparql-wrapper-python 2.0.0-2 (dsc) [2214 B]
Get:2 http://deb.debian.org/debian forky/main sparql-wrapper-python 2.0.0-2 (tar) [132 kB]
Get:3 http://deb.debian.org/debian forky/main sparql-wrapper-python 2.0.0-2 (diff) [5692 B]
Fetched 140 kB in 0s (0 B/s)
Download complete and in download only mode
=============================================================================
Re-Building sparql-wrapper-python in forky on arm64 on codethink03-arm64 now.
Date: Fri Dec 4 04:24:48 GMT 2026
Date UTC: Fri Dec 4 04:24:48 UTC 2026
=============================================================================
++ mktemp -t pbuilderrc_XXXX --tmpdir=/srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy
+ local TMPCFG=/srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/pbuilderrc_Wxxh
+ case ${ARCH} in
+ case $ARCH in
+ locale=nl_BE
+ language=nl
+ case "${SUITE}" in
+ reproducible_buildflags=+all
+ extra_deb_build_options=
+ case "${SRCPACKAGE}" in
+ cat
+ echo BUILDDIR=/build/reproducible-path
+ '[' sparql-wrapper-python = debian-installer -o sparql-wrapper-python = debian-installer-netboot-images ']'
+ pbuilder_options=()
+ local pbuilder_options
+ DEBBUILDOPTS=-b
+ BINARYTARGET=
+ '[' sparql-wrapper-python = u-boot ']'
+ case "${SRCPACKAGE}" in
+ PBUILDERTIMEOUT=24
+ local PRESULT=0
+ sudo timeout -k 24.1h 24h /usr/bin/ionice -c 3 /usr/bin/nice -n 11 /usr/bin/unshare --uts -- /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/pbuilderrc_Wxxh --distribution forky --hookdir /etc/pbuilder/rebuild-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/forky-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b2 --logfile b2/build.log sparql-wrapper-python_2.0.0-2.dsc
W: /root/.pbuilderrc does not exist
I: Logging to b2/build.log
I: pbuilder: network access will be disabled during build
I: Current time: Fri Dec 4 18:24:48 +14 2026
I: pbuilder-time-stamp: 1796358288
I: Building the build Environment
I: extracting base tarball [/var/cache/pbuilder/forky-reproducible-base.tgz]
I: copying local configuration
W: --override-config is not set; not updating apt.conf Read the manpage for details.
I: mounting /proc filesystem
I: mounting /sys filesystem
I: creating /{dev,run}/shm
I: mounting /dev/pts filesystem
I: redirecting /dev/ptmx to /dev/pts/ptmx
I: policy-rc.d already exists
I: Copying source file
I: copying [sparql-wrapper-python_2.0.0-2.dsc]
I: copying [./sparql-wrapper-python_2.0.0.orig.tar.gz]
I: copying [./sparql-wrapper-python_2.0.0-2.debian.tar.xz]
I: Extracting source
dpkg-source: warning: cannot verify inline signature for ./sparql-wrapper-python_2.0.0-2.dsc: no acceptable signature found
dpkg-source: info: extracting sparql-wrapper-python in sparql-wrapper-python-2.0.0
dpkg-source: info: unpacking sparql-wrapper-python_2.0.0.orig.tar.gz
dpkg-source: info: unpacking sparql-wrapper-python_2.0.0-2.debian.tar.xz
I: Not using root during the build.
I: Installing the build-deps
I: user script /srv/workspace/pbuilder/188717/tmp/hooks/D01_modify_environment starting
debug: Running on codethink03-arm64.
I: Changing host+domainname to test build reproducibility
I: Adding a custom variable just for the fun of it...
I: Changing /bin/sh to bash
'/bin/sh' -> '/bin/bash'
lrwxrwxrwx 1 root root 9 Dec 4 04:24 /bin/sh -> /bin/bash
I: Setting pbuilder2's login shell to /bin/bash
I: Setting pbuilder2's GECOS to second user,second room,second work-phone,second home-phone,second other
I: user script /srv/workspace/pbuilder/188717/tmp/hooks/D01_modify_environment finished
I: user script /srv/workspace/pbuilder/188717/tmp/hooks/D02_print_environment starting
I: set
BASH=/bin/sh
BASHOPTS=checkwinsize:cmdhist:complete_fullquote:extquote:force_fignore:globasciiranges:globskipdots:hostcomplete:interactive_comments:patsub_replacement:progcomp:promptvars:sourcepath
BASH_ALIASES=()
BASH_ARGC=()
BASH_ARGV=()
BASH_CMDS=()
BASH_LINENO=([0]="12" [1]="0")
BASH_LOADABLES_PATH=/usr/local/lib/bash:/usr/lib/bash:/opt/local/lib/bash:/usr/pkg/lib/bash:/opt/pkg/lib/bash:.
BASH_SOURCE=([0]="/tmp/hooks/D02_print_environment" [1]="/tmp/hooks/D02_print_environment")
BASH_VERSINFO=([0]="5" [1]="3" [2]="3" [3]="1" [4]="release" [5]="aarch64-unknown-linux-gnu")
BASH_VERSION='5.3.3(1)-release'
BUILDDIR=/build/reproducible-path
BUILDUSERGECOS='second user,second room,second work-phone,second home-phone,second other'
BUILDUSERNAME=pbuilder2
BUILD_ARCH=arm64
DEBIAN_FRONTEND=noninteractive
DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=12 nocheck'
DIRSTACK=()
DISTRIBUTION=forky
EUID=0
FUNCNAME=([0]="Echo" [1]="main")
GROUPS=()
HOME=/root
HOSTNAME=i-capture-the-hostname
HOSTTYPE=aarch64
HOST_ARCH=arm64
IFS=' '
INVOCATION_ID=f2d1fa8e896c4d6d9efa9d9bdbcf9f6f
LANG=C
LANGUAGE=nl_BE:nl
LC_ALL=C
MACHTYPE=aarch64-unknown-linux-gnu
MAIL=/var/mail/root
OPTERR=1
OPTIND=1
OSTYPE=linux-gnu
PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path
PBCURRENTCOMMANDLINEOPERATION=build
PBUILDER_OPERATION=build
PBUILDER_PKGDATADIR=/usr/share/pbuilder
PBUILDER_PKGLIBDIR=/usr/lib/pbuilder
PBUILDER_SYSCONFDIR=/etc
PIPESTATUS=([0]="0")
POSIXLY_CORRECT=y
PPID=188717
PS4='+ '
PWD=/
SHELL=/bin/bash
SHELLOPTS=braceexpand:errexit:hashall:interactive-comments:posix
SHLVL=3
SUDO_COMMAND='/usr/bin/timeout -k 24.1h 24h /usr/bin/ionice -c 3 /usr/bin/nice -n 11 /usr/bin/unshare --uts -- /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/pbuilderrc_Wxxh --distribution forky --hookdir /etc/pbuilder/rebuild-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/forky-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b2 --logfile b2/build.log sparql-wrapper-python_2.0.0-2.dsc'
SUDO_GID=109
SUDO_HOME=/var/lib/jenkins
SUDO_UID=104
SUDO_USER=jenkins
TERM=unknown
TZ=/usr/share/zoneinfo/Etc/GMT-14
UID=0
USER=root
_='I: set'
http_proxy=http://192.168.101.4:3128
I: uname -a
Linux i-capture-the-hostname 6.12.48+deb13-cloud-arm64 #1 SMP Debian 6.12.48-1 (2025-09-20) aarch64 GNU/Linux
I: ls -l /bin
lrwxrwxrwx 1 root root 7 Aug 10 2025 /bin -> usr/bin
I: user script /srv/workspace/pbuilder/188717/tmp/hooks/D02_print_environment finished
 -> Attempting to satisfy build-dependencies
 -> Creating pbuilder-satisfydepends-dummy package
Package: pbuilder-satisfydepends-dummy
Version: 0.invalid.0
Architecture: arm64
Maintainer: Debian Pbuilder Team
Description: Dummy package to satisfy dependencies with aptitude - created by pbuilder
 This package was created automatically by pbuilder to satisfy the build-dependencies of the package being currently built.
Depends: debhelper-compat (= 13), dh-sequence-python3, python3-all, python3-pytest, python3-setuptools, python3-rdflib
dpkg-deb: building package 'pbuilder-satisfydepends-dummy' in '/tmp/satisfydepends-aptitude/pbuilder-satisfydepends-dummy.deb'.
Selecting previously unselected package pbuilder-satisfydepends-dummy.
(Reading database ... 19971 files and directories currently installed.)
Preparing to unpack .../pbuilder-satisfydepends-dummy.deb ...
Unpacking pbuilder-satisfydepends-dummy (0.invalid.0) ...
dpkg: pbuilder-satisfydepends-dummy: dependency problems, but configuring anyway as you requested:
 pbuilder-satisfydepends-dummy depends on debhelper-compat (= 13); however:
  Package debhelper-compat is not installed.
 pbuilder-satisfydepends-dummy depends on dh-sequence-python3; however:
  Package dh-sequence-python3 is not installed.
 pbuilder-satisfydepends-dummy depends on python3-all; however:
  Package python3-all is not installed.
 pbuilder-satisfydepends-dummy depends on python3-pytest; however:
  Package python3-pytest is not installed.
 pbuilder-satisfydepends-dummy depends on python3-setuptools; however:
  Package python3-setuptools is not installed.
 pbuilder-satisfydepends-dummy depends on python3-rdflib; however:
  Package python3-rdflib is not installed.
Setting up pbuilder-satisfydepends-dummy (0.invalid.0) ...
Reading package lists...
Building dependency tree...
Reading state information...
Initializing package states...
Writing extended state information...
Building tag database...
pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0)
pbuilder-satisfydepends-dummy is already installed at the requested version (0.invalid.0)
The following NEW packages will be installed:
  autoconf{a} automake{a} autopoint{a} autotools-dev{a} bsdextrautils{a} debhelper{a} dh-autoreconf{a} dh-python{a}
  dh-strip-nondeterminism{a} dwz{a} file{a} gettext{a} gettext-base{a} groff-base{a} intltool-debian{a} libarchive-zip-perl{a}
  libdebhelper-perl{a} libelf1t64{a} libexpat1{a} libffi8{a} libfile-stripnondeterminism-perl{a} libmagic-mgc{a} libmagic1t64{a}
  libpipeline1{a} libpython3-stdlib{a} libpython3.13-minimal{a} libpython3.13-stdlib{a} libreadline8t64{a} libtool{a}
  libuchardet0{a} libunistring5{a} libxml2-16{a} m4{a} man-db{a} media-types{a} netbase{a} po-debconf{a} python3{a}
  python3-all{a} python3-autocommand{a} python3-inflect{a} python3-iniconfig{a} python3-jaraco.context{a}
  python3-jaraco.functools{a} python3-jaraco.text{a} python3-minimal{a} python3-more-itertools{a} python3-packaging{a}
  python3-pkg-resources{a} python3-pluggy{a} python3-pygments{a} python3-pyparsing{a} python3-pytest{a} python3-rdflib{a}
  python3-setuptools{a} python3-typeguard{a} python3-typing-extensions{a} python3-zipp{a} python3.13{a} python3.13-minimal{a}
  readline-common{a} sensible-utils{a} tzdata{a}
The following packages are RECOMMENDED but will NOT be installed:
  ca-certificates curl libarchive-cpio-perl libltdl-dev libmail-sendmail-perl lynx python3-html5rdf python3-lxml
  python3-networkx python3-orjson wget
0 packages upgraded, 63 newly installed, 0 to remove and 0 not upgraded.
Need to get 20.5 MB of archives. After unpacking 88.4 MB will be used.
Writing extended state information...
Get: 1 http://deb.debian.org/debian forky/main arm64 libexpat1 arm64 2.7.3-1 [96.5 kB]
Get: 2 http://deb.debian.org/debian forky/main arm64 libpython3.13-minimal arm64 3.13.9-1 [858 kB]
Get: 3 http://deb.debian.org/debian forky/main arm64 python3.13-minimal arm64 3.13.9-1 [2061 kB]
Get: 4 http://deb.debian.org/debian forky/main arm64 python3-minimal arm64 3.13.7-1 [27.2 kB]
Get: 5 http://deb.debian.org/debian forky/main arm64 media-types all 14.0.0 [30.8 kB]
Get: 6 http://deb.debian.org/debian forky/main arm64 netbase all 6.5 [12.4 kB]
Get: 7 http://deb.debian.org/debian forky/main arm64 tzdata all 2025b-5 [260 kB]
Get: 8 http://deb.debian.org/debian forky/main arm64 libffi8 arm64 3.5.2-2 [21.5 kB]
Get: 9 http://deb.debian.org/debian forky/main arm64 readline-common all 8.3-3 [74.8 kB]
Get: 10 http://deb.debian.org/debian forky/main arm64 libreadline8t64 arm64 8.3-3 [169 kB]
Get: 11 http://deb.debian.org/debian forky/main arm64 libpython3.13-stdlib arm64 3.13.9-1 [1900 kB]
Get: 12 http://deb.debian.org/debian forky/main arm64 python3.13 arm64 3.13.9-1 [764 kB]
Get: 13 http://deb.debian.org/debian forky/main arm64 libpython3-stdlib arm64 3.13.7-1 [10.2 kB]
Get: 14 http://deb.debian.org/debian forky/main arm64 python3 arm64 3.13.7-1 [28.3 kB]
Get:
15 http://deb.debian.org/debian forky/main arm64 sensible-utils all 0.0.26 [27.0 kB] Get: 16 http://deb.debian.org/debian forky/main arm64 libmagic-mgc arm64 1:5.46-5 [338 kB] Get: 17 http://deb.debian.org/debian forky/main arm64 libmagic1t64 arm64 1:5.46-5 [103 kB] Get: 18 http://deb.debian.org/debian forky/main arm64 file arm64 1:5.46-5 [43.7 kB] Get: 19 http://deb.debian.org/debian forky/main arm64 gettext-base arm64 0.23.1-2+b1 [241 kB] Get: 20 http://deb.debian.org/debian forky/main arm64 libuchardet0 arm64 0.0.8-2 [69.0 kB] Get: 21 http://deb.debian.org/debian forky/main arm64 groff-base arm64 1.23.0-9 [1130 kB] Get: 22 http://deb.debian.org/debian forky/main arm64 bsdextrautils arm64 2.41.2-4 [97.3 kB] Get: 23 http://deb.debian.org/debian forky/main arm64 libpipeline1 arm64 1.5.8-1 [40.2 kB] Get: 24 http://deb.debian.org/debian forky/main arm64 man-db arm64 2.13.1-1 [1453 kB] Get: 25 http://deb.debian.org/debian forky/main arm64 m4 arm64 1.4.20-2 [315 kB] Get: 26 http://deb.debian.org/debian forky/main arm64 autoconf all 2.72-3.1 [494 kB] Get: 27 http://deb.debian.org/debian forky/main arm64 autotools-dev all 20240727.1 [60.2 kB] Get: 28 http://deb.debian.org/debian forky/main arm64 automake all 1:1.18.1-2 [877 kB] Get: 29 http://deb.debian.org/debian forky/main arm64 autopoint all 0.23.1-2 [770 kB] Get: 30 http://deb.debian.org/debian forky/main arm64 libdebhelper-perl all 13.28 [92.4 kB] Get: 31 http://deb.debian.org/debian forky/main arm64 libtool all 2.5.4-7 [540 kB] Get: 32 http://deb.debian.org/debian forky/main arm64 dh-autoreconf all 21 [12.2 kB] Get: 33 http://deb.debian.org/debian forky/main arm64 libarchive-zip-perl all 1.68-1 [104 kB] Get: 34 http://deb.debian.org/debian forky/main arm64 libfile-stripnondeterminism-perl all 1.15.0-1 [19.9 kB] Get: 35 http://deb.debian.org/debian forky/main arm64 dh-strip-nondeterminism all 1.15.0-1 [8812 B] Get: 36 http://deb.debian.org/debian forky/main arm64 libelf1t64 arm64 0.193-3 [189 kB] Get: 37 
http://deb.debian.org/debian forky/main arm64 dwz arm64 0.16-2 [100 kB] Get: 38 http://deb.debian.org/debian forky/main arm64 libunistring5 arm64 1.3-2 [453 kB] Get: 39 http://deb.debian.org/debian forky/main arm64 libxml2-16 arm64 2.14.6+dfsg-0.1 [601 kB] Get: 40 http://deb.debian.org/debian forky/main arm64 gettext arm64 0.23.1-2+b1 [1612 kB] Get: 41 http://deb.debian.org/debian forky/main arm64 intltool-debian all 0.35.0+20060710.6 [22.9 kB] Get: 42 http://deb.debian.org/debian forky/main arm64 po-debconf all 1.0.21+nmu1 [248 kB] Get: 43 http://deb.debian.org/debian forky/main arm64 debhelper all 13.28 [941 kB] Get: 44 http://deb.debian.org/debian forky/main arm64 dh-python all 6.20250414 [116 kB] Get: 45 http://deb.debian.org/debian forky/main arm64 python3-all arm64 3.13.7-1 [1044 B] Get: 46 http://deb.debian.org/debian forky/main arm64 python3-autocommand all 2.2.2-3 [13.6 kB] Get: 47 http://deb.debian.org/debian forky/main arm64 python3-more-itertools all 10.8.0-1 [71.7 kB] Get: 48 http://deb.debian.org/debian forky/main arm64 python3-typing-extensions all 4.15.0-1 [92.4 kB] Get: 49 http://deb.debian.org/debian forky/main arm64 python3-typeguard all 4.4.4-1 [37.1 kB] Get: 50 http://deb.debian.org/debian forky/main arm64 python3-inflect all 7.5.0-1 [33.0 kB] Get: 51 http://deb.debian.org/debian forky/main arm64 python3-iniconfig all 2.1.0-1 [7432 B] Get: 52 http://deb.debian.org/debian forky/main arm64 python3-jaraco.functools all 4.1.0-1 [12.0 kB] Get: 53 http://deb.debian.org/debian forky/main arm64 python3-pkg-resources all 78.1.1-0.1 [224 kB] Get: 54 http://deb.debian.org/debian forky/main arm64 python3-jaraco.text all 4.0.0-1 [11.4 kB] Get: 55 http://deb.debian.org/debian forky/main arm64 python3-zipp all 3.23.0-1 [11.0 kB] Get: 56 http://deb.debian.org/debian forky/main arm64 python3-setuptools all 78.1.1-0.1 [738 kB] Get: 57 http://deb.debian.org/debian forky/main arm64 python3-jaraco.context all 6.0.1-1 [8276 B] Get: 58 http://deb.debian.org/debian 
forky/main arm64 python3-packaging all 25.0-1 [56.6 kB] Get: 59 http://deb.debian.org/debian forky/main arm64 python3-pluggy all 1.6.0-1 [27.1 kB] Get: 60 http://deb.debian.org/debian forky/main arm64 python3-pygments all 2.18.0+dfsg-2 [836 kB] Get: 61 http://deb.debian.org/debian forky/main arm64 python3-pyparsing all 3.1.3-1 [148 kB] Get: 62 http://deb.debian.org/debian forky/main arm64 python3-pytest all 8.4.2-1 [266 kB] Get: 63 http://deb.debian.org/debian forky/main arm64 python3-rdflib all 7.1.1-3 [472 kB] Fetched 20.5 MB in 0s (114 MB/s) Preconfiguring packages ... Selecting previously unselected package libexpat1:arm64. (Reading database ... 19971 files and directories currently installed.) Preparing to unpack .../libexpat1_2.7.3-1_arm64.deb ... Unpacking libexpat1:arm64 (2.7.3-1) ... Selecting previously unselected package libpython3.13-minimal:arm64. Preparing to unpack .../libpython3.13-minimal_3.13.9-1_arm64.deb ... Unpacking libpython3.13-minimal:arm64 (3.13.9-1) ... Selecting previously unselected package python3.13-minimal. Preparing to unpack .../python3.13-minimal_3.13.9-1_arm64.deb ... Unpacking python3.13-minimal (3.13.9-1) ... Setting up libpython3.13-minimal:arm64 (3.13.9-1) ... Setting up libexpat1:arm64 (2.7.3-1) ... Setting up python3.13-minimal (3.13.9-1) ... Selecting previously unselected package python3-minimal. 
(Reading database ... 20305 files and directories currently installed.) Preparing to unpack .../0-python3-minimal_3.13.7-1_arm64.deb ... Unpacking python3-minimal (3.13.7-1) ... Selecting previously unselected package media-types. Preparing to unpack .../1-media-types_14.0.0_all.deb ... Unpacking media-types (14.0.0) ... Selecting previously unselected package netbase. Preparing to unpack .../2-netbase_6.5_all.deb ... Unpacking netbase (6.5) ... Selecting previously unselected package tzdata. Preparing to unpack .../3-tzdata_2025b-5_all.deb ... Unpacking tzdata (2025b-5) ... Selecting previously unselected package libffi8:arm64. Preparing to unpack .../4-libffi8_3.5.2-2_arm64.deb ... Unpacking libffi8:arm64 (3.5.2-2) ... Selecting previously unselected package readline-common. Preparing to unpack .../5-readline-common_8.3-3_all.deb ... Unpacking readline-common (8.3-3) ... Selecting previously unselected package libreadline8t64:arm64. Preparing to unpack .../6-libreadline8t64_8.3-3_arm64.deb ... 
Adding 'diversion of /lib/aarch64-linux-gnu/libhistory.so.8 to /lib/aarch64-linux-gnu/libhistory.so.8.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/aarch64-linux-gnu/libhistory.so.8.2 to /lib/aarch64-linux-gnu/libhistory.so.8.2.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/aarch64-linux-gnu/libreadline.so.8 to /lib/aarch64-linux-gnu/libreadline.so.8.usr-is-merged by libreadline8t64' Adding 'diversion of /lib/aarch64-linux-gnu/libreadline.so.8.2 to /lib/aarch64-linux-gnu/libreadline.so.8.2.usr-is-merged by libreadline8t64' Unpacking libreadline8t64:arm64 (8.3-3) ... Selecting previously unselected package libpython3.13-stdlib:arm64. Preparing to unpack .../7-libpython3.13-stdlib_3.13.9-1_arm64.deb ... Unpacking libpython3.13-stdlib:arm64 (3.13.9-1) ... Selecting previously unselected package python3.13. Preparing to unpack .../8-python3.13_3.13.9-1_arm64.deb ... Unpacking python3.13 (3.13.9-1) ... Selecting previously unselected package libpython3-stdlib:arm64. Preparing to unpack .../9-libpython3-stdlib_3.13.7-1_arm64.deb ... Unpacking libpython3-stdlib:arm64 (3.13.7-1) ... Setting up python3-minimal (3.13.7-1) ... Selecting previously unselected package python3. (Reading database ... 21320 files and directories currently installed.) Preparing to unpack .../00-python3_3.13.7-1_arm64.deb ... Unpacking python3 (3.13.7-1) ... Selecting previously unselected package sensible-utils. 
Preparing to unpack .../01-sensible-utils_0.0.26_all.deb ... Unpacking sensible-utils (0.0.26) ... Selecting previously unselected package libmagic-mgc. Preparing to unpack .../02-libmagic-mgc_1%3a5.46-5_arm64.deb ... Unpacking libmagic-mgc (1:5.46-5) ... Selecting previously unselected package libmagic1t64:arm64. Preparing to unpack .../03-libmagic1t64_1%3a5.46-5_arm64.deb ... Unpacking libmagic1t64:arm64 (1:5.46-5) ... Selecting previously unselected package file. Preparing to unpack .../04-file_1%3a5.46-5_arm64.deb ... Unpacking file (1:5.46-5) ... Selecting previously unselected package gettext-base. Preparing to unpack .../05-gettext-base_0.23.1-2+b1_arm64.deb ... Unpacking gettext-base (0.23.1-2+b1) ... Selecting previously unselected package libuchardet0:arm64. Preparing to unpack .../06-libuchardet0_0.0.8-2_arm64.deb ... Unpacking libuchardet0:arm64 (0.0.8-2) ... Selecting previously unselected package groff-base. Preparing to unpack .../07-groff-base_1.23.0-9_arm64.deb ... Unpacking groff-base (1.23.0-9) ... Selecting previously unselected package bsdextrautils. Preparing to unpack .../08-bsdextrautils_2.41.2-4_arm64.deb ... Unpacking bsdextrautils (2.41.2-4) ... Selecting previously unselected package libpipeline1:arm64. Preparing to unpack .../09-libpipeline1_1.5.8-1_arm64.deb ... Unpacking libpipeline1:arm64 (1.5.8-1) ... Selecting previously unselected package man-db. Preparing to unpack .../10-man-db_2.13.1-1_arm64.deb ... Unpacking man-db (2.13.1-1) ... Selecting previously unselected package m4. Preparing to unpack .../11-m4_1.4.20-2_arm64.deb ... Unpacking m4 (1.4.20-2) ... Selecting previously unselected package autoconf. Preparing to unpack .../12-autoconf_2.72-3.1_all.deb ... Unpacking autoconf (2.72-3.1) ... Selecting previously unselected package autotools-dev. Preparing to unpack .../13-autotools-dev_20240727.1_all.deb ... Unpacking autotools-dev (20240727.1) ... Selecting previously unselected package automake. 
Preparing to unpack .../14-automake_1%3a1.18.1-2_all.deb ... Unpacking automake (1:1.18.1-2) ... Selecting previously unselected package autopoint. Preparing to unpack .../15-autopoint_0.23.1-2_all.deb ... Unpacking autopoint (0.23.1-2) ... Selecting previously unselected package libdebhelper-perl. Preparing to unpack .../16-libdebhelper-perl_13.28_all.deb ... Unpacking libdebhelper-perl (13.28) ... Selecting previously unselected package libtool. Preparing to unpack .../17-libtool_2.5.4-7_all.deb ... Unpacking libtool (2.5.4-7) ... Selecting previously unselected package dh-autoreconf. Preparing to unpack .../18-dh-autoreconf_21_all.deb ... Unpacking dh-autoreconf (21) ... Selecting previously unselected package libarchive-zip-perl. Preparing to unpack .../19-libarchive-zip-perl_1.68-1_all.deb ... Unpacking libarchive-zip-perl (1.68-1) ... Selecting previously unselected package libfile-stripnondeterminism-perl. Preparing to unpack .../20-libfile-stripnondeterminism-perl_1.15.0-1_all.deb ... Unpacking libfile-stripnondeterminism-perl (1.15.0-1) ... Selecting previously unselected package dh-strip-nondeterminism. Preparing to unpack .../21-dh-strip-nondeterminism_1.15.0-1_all.deb ... Unpacking dh-strip-nondeterminism (1.15.0-1) ... Selecting previously unselected package libelf1t64:arm64. Preparing to unpack .../22-libelf1t64_0.193-3_arm64.deb ... Unpacking libelf1t64:arm64 (0.193-3) ... Selecting previously unselected package dwz. Preparing to unpack .../23-dwz_0.16-2_arm64.deb ... Unpacking dwz (0.16-2) ... Selecting previously unselected package libunistring5:arm64. Preparing to unpack .../24-libunistring5_1.3-2_arm64.deb ... Unpacking libunistring5:arm64 (1.3-2) ... Selecting previously unselected package libxml2-16:arm64. Preparing to unpack .../25-libxml2-16_2.14.6+dfsg-0.1_arm64.deb ... Unpacking libxml2-16:arm64 (2.14.6+dfsg-0.1) ... Selecting previously unselected package gettext. Preparing to unpack .../26-gettext_0.23.1-2+b1_arm64.deb ... 
Unpacking gettext (0.23.1-2+b1) ... Selecting previously unselected package intltool-debian. Preparing to unpack .../27-intltool-debian_0.35.0+20060710.6_all.deb ... Unpacking intltool-debian (0.35.0+20060710.6) ... Selecting previously unselected package po-debconf. Preparing to unpack .../28-po-debconf_1.0.21+nmu1_all.deb ... Unpacking po-debconf (1.0.21+nmu1) ... Selecting previously unselected package debhelper. Preparing to unpack .../29-debhelper_13.28_all.deb ... Unpacking debhelper (13.28) ... Selecting previously unselected package dh-python. Preparing to unpack .../30-dh-python_6.20250414_all.deb ... Unpacking dh-python (6.20250414) ... Selecting previously unselected package python3-all. Preparing to unpack .../31-python3-all_3.13.7-1_arm64.deb ... Unpacking python3-all (3.13.7-1) ... Selecting previously unselected package python3-autocommand. Preparing to unpack .../32-python3-autocommand_2.2.2-3_all.deb ... Unpacking python3-autocommand (2.2.2-3) ... Selecting previously unselected package python3-more-itertools. Preparing to unpack .../33-python3-more-itertools_10.8.0-1_all.deb ... Unpacking python3-more-itertools (10.8.0-1) ... Selecting previously unselected package python3-typing-extensions. Preparing to unpack .../34-python3-typing-extensions_4.15.0-1_all.deb ... Unpacking python3-typing-extensions (4.15.0-1) ... Selecting previously unselected package python3-typeguard. Preparing to unpack .../35-python3-typeguard_4.4.4-1_all.deb ... Unpacking python3-typeguard (4.4.4-1) ... Selecting previously unselected package python3-inflect. Preparing to unpack .../36-python3-inflect_7.5.0-1_all.deb ... Unpacking python3-inflect (7.5.0-1) ... Selecting previously unselected package python3-iniconfig. Preparing to unpack .../37-python3-iniconfig_2.1.0-1_all.deb ... Unpacking python3-iniconfig (2.1.0-1) ... Selecting previously unselected package python3-jaraco.functools. Preparing to unpack .../38-python3-jaraco.functools_4.1.0-1_all.deb ... 
Unpacking python3-jaraco.functools (4.1.0-1) ... Selecting previously unselected package python3-pkg-resources. Preparing to unpack .../39-python3-pkg-resources_78.1.1-0.1_all.deb ... Unpacking python3-pkg-resources (78.1.1-0.1) ... Selecting previously unselected package python3-jaraco.text. Preparing to unpack .../40-python3-jaraco.text_4.0.0-1_all.deb ... Unpacking python3-jaraco.text (4.0.0-1) ... Selecting previously unselected package python3-zipp. Preparing to unpack .../41-python3-zipp_3.23.0-1_all.deb ... Unpacking python3-zipp (3.23.0-1) ... Selecting previously unselected package python3-setuptools. Preparing to unpack .../42-python3-setuptools_78.1.1-0.1_all.deb ... Unpacking python3-setuptools (78.1.1-0.1) ... Selecting previously unselected package python3-jaraco.context. Preparing to unpack .../43-python3-jaraco.context_6.0.1-1_all.deb ... Unpacking python3-jaraco.context (6.0.1-1) ... Selecting previously unselected package python3-packaging. Preparing to unpack .../44-python3-packaging_25.0-1_all.deb ... Unpacking python3-packaging (25.0-1) ... Selecting previously unselected package python3-pluggy. Preparing to unpack .../45-python3-pluggy_1.6.0-1_all.deb ... Unpacking python3-pluggy (1.6.0-1) ... Selecting previously unselected package python3-pygments. Preparing to unpack .../46-python3-pygments_2.18.0+dfsg-2_all.deb ... Unpacking python3-pygments (2.18.0+dfsg-2) ... Selecting previously unselected package python3-pyparsing. Preparing to unpack .../47-python3-pyparsing_3.1.3-1_all.deb ... Unpacking python3-pyparsing (3.1.3-1) ... Selecting previously unselected package python3-pytest. Preparing to unpack .../48-python3-pytest_8.4.2-1_all.deb ... Unpacking python3-pytest (8.4.2-1) ... Selecting previously unselected package python3-rdflib. Preparing to unpack .../49-python3-rdflib_7.1.1-3_all.deb ... Unpacking python3-rdflib (7.1.1-3) ... Setting up media-types (14.0.0) ... Setting up libpipeline1:arm64 (1.5.8-1) ... 
Setting up bsdextrautils (2.41.2-4) ... Setting up libmagic-mgc (1:5.46-5) ... Setting up libarchive-zip-perl (1.68-1) ... Setting up libxml2-16:arm64 (2.14.6+dfsg-0.1) ... Setting up libdebhelper-perl (13.28) ... Setting up libmagic1t64:arm64 (1:5.46-5) ... Setting up gettext-base (0.23.1-2+b1) ... Setting up m4 (1.4.20-2) ... Setting up file (1:5.46-5) ... Setting up libelf1t64:arm64 (0.193-3) ... Setting up tzdata (2025b-5) ... Current default time zone: 'Etc/UTC' Local time is now: Fri Dec 4 04:25:07 UTC 2026. Universal Time is now: Fri Dec 4 04:25:07 UTC 2026. Run 'dpkg-reconfigure tzdata' if you wish to change it. Setting up autotools-dev (20240727.1) ... Setting up libunistring5:arm64 (1.3-2) ... Setting up autopoint (0.23.1-2) ... Setting up autoconf (2.72-3.1) ... Setting up libffi8:arm64 (3.5.2-2) ... Setting up dwz (0.16-2) ... Setting up sensible-utils (0.0.26) ... Setting up libuchardet0:arm64 (0.0.8-2) ... Setting up netbase (6.5) ... Setting up readline-common (8.3-3) ... Setting up automake (1:1.18.1-2) ... update-alternatives: using /usr/bin/automake-1.18 to provide /usr/bin/automake (automake) in auto mode Setting up libfile-stripnondeterminism-perl (1.15.0-1) ... Setting up gettext (0.23.1-2+b1) ... Setting up libtool (2.5.4-7) ... Setting up intltool-debian (0.35.0+20060710.6) ... Setting up dh-autoreconf (21) ... Setting up libreadline8t64:arm64 (8.3-3) ... Setting up dh-strip-nondeterminism (1.15.0-1) ... Setting up groff-base (1.23.0-9) ... Setting up libpython3.13-stdlib:arm64 (3.13.9-1) ... Setting up libpython3-stdlib:arm64 (3.13.7-1) ... Setting up python3.13 (3.13.9-1) ... Setting up po-debconf (1.0.21+nmu1) ... Setting up python3 (3.13.7-1) ... Setting up python3-zipp (3.23.0-1) ... Setting up python3-autocommand (2.2.2-3) ... Setting up man-db (2.13.1-1) ... Not building database; man-db/auto-update is not 'true'. Setting up python3-pygments (2.18.0+dfsg-2) ... Setting up python3-packaging (25.0-1) ... 
Setting up python3-pyparsing (3.1.3-1) ... Setting up python3-typing-extensions (4.15.0-1) ... Setting up python3-pluggy (1.6.0-1) ... Setting up python3-rdflib (7.1.1-3) ... Setting up dh-python (6.20250414) ... Setting up python3-more-itertools (10.8.0-1) ... Setting up python3-iniconfig (2.1.0-1) ... Setting up python3-jaraco.functools (4.1.0-1) ... Setting up python3-jaraco.context (6.0.1-1) ... Setting up python3-pytest (8.4.2-1) ... Setting up python3-typeguard (4.4.4-1) ... Setting up python3-all (3.13.7-1) ... Setting up debhelper (13.28) ... Setting up python3-inflect (7.5.0-1) ... Setting up python3-jaraco.text (4.0.0-1) ... Setting up python3-pkg-resources (78.1.1-0.1) ... Setting up python3-setuptools (78.1.1-0.1) ... Processing triggers for libc-bin (2.41-12) ... Reading package lists... Building dependency tree... Reading state information... Reading extended state information... Initializing package states... Writing extended state information... Building tag database... -> Finished parsing the build-deps I: Building the package I: user script /srv/workspace/pbuilder/188717/tmp/hooks/A99_set_merged_usr starting Not re-configuring usrmerge for forky I: user script /srv/workspace/pbuilder/188717/tmp/hooks/A99_set_merged_usr finished hostname: Name or service not known I: Running cd /build/reproducible-path/sparql-wrapper-python-2.0.0/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-genchanges -S > ../sparql-wrapper-python_2.0.0-2_source.changes dpkg-buildpackage: info: source package sparql-wrapper-python dpkg-buildpackage: info: source version 2.0.0-2 dpkg-buildpackage: info: source distribution unstable dpkg-buildpackage: info: source changed by Alexandre Detiste dpkg-source --before-build . 
dpkg-buildpackage: info: host architecture arm64 debian/rules clean dh clean --buildsystem=pybuild dh_auto_clean -O--buildsystem=pybuild I: pybuild base:311: python3.13 setup.py clean /usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated. !! ******************************************************************************** Please consider removing the following classifiers in favor of a SPDX license expression: License :: OSI Approved :: W3C License See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details. ******************************************************************************** !! self._finalize_license_expression() running clean removing '/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build' (and everything under it) 'build/bdist.linux-aarch64' does not exist -- can't clean it 'build/scripts-3.13' does not exist -- can't clean it dh_autoreconf_clean -O--buildsystem=pybuild dh_clean -O--buildsystem=pybuild debian/rules binary dh binary --buildsystem=pybuild dh_update_autotools_config -O--buildsystem=pybuild dh_autoreconf -O--buildsystem=pybuild dh_auto_configure -O--buildsystem=pybuild I: pybuild base:311: python3.13 setup.py config /usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated. !! ******************************************************************************** Please consider removing the following classifiers in favor of a SPDX license expression: License :: OSI Approved :: W3C License See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details. ******************************************************************************** !! 
self._finalize_license_expression() running config dh_auto_build -O--buildsystem=pybuild I: pybuild base:311: /usr/bin/python3 setup.py build /usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated. !! ******************************************************************************** Please consider removing the following classifiers in favor of a SPDX license expression: License :: OSI Approved :: W3C License See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details. ******************************************************************************** !! self._finalize_license_expression() running build running build_py creating /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper copying SPARQLWrapper/py.typed -> 
/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper I: pybuild pybuild:334: cp -r test /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build dh: command-omitted: The call to "debian/rules override_dh_auto_test" was omitted due to "DEB_BUILD_OPTIONS=nocheck" create-stamp debian/debhelper-build-stamp dh_testroot -O--buildsystem=pybuild dh_prep -O--buildsystem=pybuild dh_auto_install --destdir=debian/python3-sparqlwrapper/ -O--buildsystem=pybuild I: pybuild pybuild:308: rm -fr /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test I: pybuild base:311: /usr/bin/python3 setup.py install --root /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper /usr/lib/python3/dist-packages/setuptools/dist.py:759: SetuptoolsDeprecationWarning: License classifiers are deprecated. !! ******************************************************************************** Please consider removing the following classifiers in favor of a SPDX license expression: License :: OSI Approved :: W3C License See https://packaging.python.org/en/latest/guides/writing-pyproject-toml/#license for details. ******************************************************************************** !! self._finalize_license_expression() running install /usr/lib/python3/dist-packages/setuptools/_distutils/cmd.py:90: SetuptoolsDeprecationWarning: setup.py install is deprecated. !! ******************************************************************************** Please avoid running ``setup.py`` directly. Instead, use pypa/build, pypa/installer or other standards-based tools. See https://blog.ganssle.io/articles/2021/10/setup-py-deprecated.html for details. ******************************************************************************** !! 
self.initialize_options()
running build
running build_py
running install_lib
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages
creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/SmartWrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__init__.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/main.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/sparql_dataframe.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/KeyCaseInsensitiveDict.py to KeyCaseInsensitiveDict.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/SPARQLExceptions.py to SPARQLExceptions.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/SmartWrapper.py to SmartWrapper.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/Wrapper.py to Wrapper.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__init__.py to __init__.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/main.py to main.cpython-313.pyc
byte-compiling /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/sparql_dataframe.py to sparql_dataframe.cpython-313.pyc
running install_egg_info
running egg_info
creating SPARQLWrapper.egg-info
writing SPARQLWrapper.egg-info/PKG-INFO
writing dependency_links to SPARQLWrapper.egg-info/dependency_links.txt
writing entry points to SPARQLWrapper.egg-info/entry_points.txt
writing requirements to SPARQLWrapper.egg-info/requires.txt
writing top-level names to SPARQLWrapper.egg-info/top_level.txt
writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
reading manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no previously-included files found matching 'Makefile'
warning: no directories found matching 'docs/build/html'
adding license file 'LICENSE.txt'
adding license file 'AUTHORS.md'
writing manifest file 'SPARQLWrapper.egg-info/SOURCES.txt'
Copying SPARQLWrapper.egg-info to /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper-2.0.0.egg-info
Skipping SOURCES.txt
running install_scripts
Installing rqw script to /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/bin
   dh_installdocs -O--buildsystem=pybuild
   dh_installchangelogs -O--buildsystem=pybuild
   dh_installexamples -O--buildsystem=pybuild
   dh_python3 -O--buildsystem=pybuild
I: dh_python3 tools:114: replacing shebang in debian/python3-sparqlwrapper/usr/bin/rqw
   dh_installsystemduser -O--buildsystem=pybuild
   dh_perl -O--buildsystem=pybuild
   dh_link -O--buildsystem=pybuild
   dh_strip_nondeterminism -O--buildsystem=pybuild
   dh_compress -O--buildsystem=pybuild
   dh_fixperms -O--buildsystem=pybuild
   dh_missing -O--buildsystem=pybuild
   dh_installdeb -O--buildsystem=pybuild
   dh_gencontrol -O--buildsystem=pybuild
   dh_md5sums -O--buildsystem=pybuild
   dh_builddeb -O--buildsystem=pybuild
dpkg-deb: building package 'python3-sparqlwrapper' in '../python3-sparqlwrapper_2.0.0-2_all.deb'.
 dpkg-genbuildinfo --build=binary -O../sparql-wrapper-python_2.0.0-2_arm64.buildinfo
 dpkg-genchanges --build=binary -O../sparql-wrapper-python_2.0.0-2_arm64.changes
dpkg-genchanges: info: binary-only upload (no source code included)
 dpkg-source --after-build .
dpkg-buildpackage: info: binary-only upload (no source included)
dpkg-genchanges: info: not including original source code in upload
I: copying local configuration
I: user script /srv/workspace/pbuilder/188717/tmp/hooks/B01_cleanup starting
I: user script /srv/workspace/pbuilder/188717/tmp/hooks/B01_cleanup finished
I: unmounting dev/ptmx filesystem
I: unmounting dev/pts filesystem
I: unmounting dev/shm filesystem
I: unmounting proc filesystem
I: unmounting sys filesystem
I: cleaning the build env
I: removing directory /srv/workspace/pbuilder/188717 and its subdirectories
I: Current time: Fri Dec 4 18:25:26 +14 2026
I: pbuilder-time-stamp: 1796358326
+ false
+ set +x
Fri Dec 4 04:25:26 UTC 2026 I: Signing ./b2/sparql-wrapper-python_2.0.0-2_arm64.buildinfo as sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc
Fri Dec 4 04:25:26 UTC 2026 I: Signed ./b2/sparql-wrapper-python_2.0.0-2_arm64.buildinfo as ./b2/sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc
Fri Dec 4 04:25:26 UTC 2026 - build #2 for sparql-wrapper-python/forky/arm64 on codethink03-arm64 done. Starting cleanup.
All cleanup done.
Fri Dec 4 04:25:26 UTC 2026 - reproducible_build.sh stopped running as /tmp/jenkins-script-CqsiKTZG, removing.
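Build #2 above ran under a deliberately varied environment: the clock was shifted into 2026 (pbuilder-time-stamp 1796358288 vs 1761947304), the timezone changed from Etc/GMT+12 to Etc/GMT-14, and the locale, hostname, shell and build user differ as well. As a toy illustration (not part of reproducible_build.sh), the snippet below shows why varying TZ alone catches a whole class of unreproducibility: any build step that renders the current wall-clock time into its output produces different bytes under each zone, even for the same instant.

```python
import os
import time

def build_stamp(epoch: int) -> str:
    """A build step that (unreproducibly) embeds local wall-clock time."""
    return time.strftime("%Y-%m-%d %H:%M", time.localtime(epoch))

EPOCH = 1761947304  # pbuilder-time-stamp of the first build

os.environ["TZ"] = "Etc/GMT+12"  # first build's zone (UTC-12)
time.tzset()
first = build_stamp(EPOCH)

os.environ["TZ"] = "Etc/GMT-14"  # second build's zone (UTC+14)
time.tzset()
second = build_stamp(EPOCH)

# Same instant, two different rendered strings: a build difference.
print(first, "!=", second)
```

Tools like dh_strip_nondeterminism (run above) exist precisely to normalize such embedded timestamps before the packages are compared.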
/srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy:
total 7708
drwxrwxr-x 2 jenkins jenkins    4096 Oct 31 22:01 b1
drwxrwxr-x 2 jenkins jenkins    4096 Oct 31 22:02 b2
-rw------- 1 jenkins jenkins 7865915 Oct 31 22:01 rbuildlog.r4FpzJa
-rw-rw-r-- 1 jenkins jenkins    2214 Jun 26  2024 sparql-wrapper-python_2.0.0-2.dsc

/srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b1:
total 7768
-rw-r--r-- 1 jenkins jenkins 7862197 Oct 31 22:01 build.log
-rw-r--r-- 1 jenkins jenkins   39232 Oct 31 22:01 python3-sparqlwrapper_2.0.0-2_all.deb
-rw-r--r-- 1 jenkins jenkins    5692 Oct 31 22:01 sparql-wrapper-python_2.0.0-2.debian.tar.xz
-rw-r--r-- 1 jenkins jenkins    2214 Oct 31 22:01 sparql-wrapper-python_2.0.0-2.dsc
-rw-r--r-- 1 jenkins jenkins    5639 Oct 31 22:01 sparql-wrapper-python_2.0.0-2_arm64.buildinfo
-rw-rw-r-- 1 jenkins jenkins    6521 Oct 31 22:01 sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc
-rw-r--r-- 1 jenkins jenkins    1127 Oct 31 22:01 sparql-wrapper-python_2.0.0-2_arm64.changes
-rw-r--r-- 1 jenkins jenkins    1315 Oct 31 22:01 sparql-wrapper-python_2.0.0-2_source.changes

/srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b2:
total 120
-rw-rw-r-- 1 jenkins jenkins   43623 Oct 31 22:02 build.log
-rw-r--r-- 1 jenkins jenkins   39232 Oct 31 22:02 python3-sparqlwrapper_2.0.0-2_all.deb
-rw-r--r-- 1 jenkins jenkins    5692 Oct 31 22:02 sparql-wrapper-python_2.0.0-2.debian.tar.xz
-rw-r--r-- 1 jenkins jenkins    2214 Oct 31 22:02 sparql-wrapper-python_2.0.0-2.dsc
-rw-rw-r-- 1 jenkins jenkins    5647 Oct 31 22:02 sparql-wrapper-python_2.0.0-2_arm64.buildinfo
-rw-rw-r-- 1 jenkins jenkins    6529 Oct 31 22:02 sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc
-rw-rw-r-- 1 jenkins jenkins    1127 Oct 31 22:02 sparql-wrapper-python_2.0.0-2_arm64.changes
-rw-rw-r-- 1 jenkins jenkins    1315 Oct 31 22:02 sparql-wrapper-python_2.0.0-2_source.changes

Fri Oct 31 22:02:27 UTC 2025 I: Deleting $TMPDIR on codethink03-arm64.debian.net.
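The listing above shows both builds produced a python3-sparqlwrapper_2.0.0-2_all.deb of identical size (39232 bytes); reproducibility means the artifacts are bit-for-bit identical, which the framework verifies via the checksums recorded in the .buildinfo files. A minimal sketch of such a check (hypothetical helper, not the framework's actual code), comparing the b1/ and b2/ result directories directly:

```python
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Hex SHA-256 digest of a file, read in chunks to bound memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def compare_builds(b1: Path, b2: Path, pattern: str = "*.deb") -> bool:
    """True iff every artifact matching pattern is bit-for-bit identical
    between the two build-result directories."""
    return all(
        sha256(p) == sha256(b2 / p.name)
        for p in sorted(b1.glob(pattern))
    )
```

In practice the Debian infrastructure compares the Checksums-Sha256 stanzas of the two .buildinfo files and runs diffoscope on any artifact that differs.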
Fri Oct 31 22:02:28 UTC 2025 I: sparql-wrapper-python_2.0.0-2_arm64.changes:
Format: 1.8
Date: Wed, 26 Jun 2024 09:15:38 +0200
Source: sparql-wrapper-python
Binary: python3-sparqlwrapper
Architecture: all
Version: 2.0.0-2
Distribution: unstable
Urgency: medium
Maintainer: Debian Python Team
Changed-By: Alexandre Detiste
Description:
 python3-sparqlwrapper - SPARQL endpoint interface to Python3
Changes:
 sparql-wrapper-python (2.0.0-2) unstable; urgency=medium
 .
   * Team upload.
   * Release to unstable
Checksums-Sha1:
 64adebaed2f378de11838e748cbde46bd71db82c 39232 python3-sparqlwrapper_2.0.0-2_all.deb
 220707387d9ab2f5c0c01b5c2e875c26452f7cc5 5639 sparql-wrapper-python_2.0.0-2_arm64.buildinfo
Checksums-Sha256:
 7ac5bf47a01b741c0976370c4d658c6c6996d4c9ed18bf783b837d9bffde4ab5 39232 python3-sparqlwrapper_2.0.0-2_all.deb
 91648502505ae87703d649af230f635a29e979411117245b761df0aa5cb7cb90 5639 sparql-wrapper-python_2.0.0-2_arm64.buildinfo
Files:
 527ef6a0a6b181c68ee26c617d79a226 39232 python optional python3-sparqlwrapper_2.0.0-2_all.deb
 7df9281ae11f58b0e1e912fdb7409dc1 5639 python optional sparql-wrapper-python_2.0.0-2_arm64.buildinfo
removed '/var/lib/jenkins/userContent/reproducible/debian/rbuild/forky/arm64/sparql-wrapper-python_2.0.0-2.rbuild.log'
removed '/var/lib/jenkins/userContent/reproducible/debian/rbuild/forky/arm64/sparql-wrapper-python_2.0.0-2.rbuild.log.gz'
removed '/var/lib/jenkins/userContent/reproducible/debian/logs/forky/arm64/sparql-wrapper-python_2.0.0-2.build1.log.gz'
removed '/var/lib/jenkins/userContent/reproducible/debian/logs/forky/arm64/sparql-wrapper-python_2.0.0-2.build2.log.gz'
removed '/var/lib/jenkins/userContent/reproducible/debian/buildinfo/forky/arm64/sparql-wrapper-python_2.0.0-2_arm64.buildinfo'
removed '/var/lib/jenkins/userContent/reproducible/debian/logdiffs/forky/arm64/sparql-wrapper-python_2.0.0-2.diff.gz'
Diff of the two buildlogs:
--
--- b1/build.log	2025-10-31 22:01:46.323132191 +0000
+++ b2/build.log	2025-10-31 22:02:27.467183772 +0000
@@ -1,6 +1,6 @@
 I: pbuilder: network access will be disabled during build
-I: Current time: Fri Oct 31 09:48:24 -12 2025
-I: pbuilder-time-stamp: 1761947304
+I: Current time: Fri Dec 4 18:24:48 +14 2026
+I: pbuilder-time-stamp: 1796358288
 I: Building the build Environment
 I: extracting base tarball [/var/cache/pbuilder/forky-reproducible-base.tgz]
 I: copying local configuration
@@ -22,53 +22,85 @@
 dpkg-source: info: unpacking sparql-wrapper-python_2.0.0-2.debian.tar.xz
 I: Not using root during the build.
 I: Installing the build-deps
-I: user script /srv/workspace/pbuilder/4132079/tmp/hooks/D02_print_environment starting
+I: user script /srv/workspace/pbuilder/188717/tmp/hooks/D01_modify_environment starting
+debug: Running on codethink03-arm64.
+I: Changing host+domainname to test build reproducibility
+I: Adding a custom variable just for the fun of it...
+I: Changing /bin/sh to bash
+'/bin/sh' -> '/bin/bash'
+lrwxrwxrwx 1 root root 9 Dec 4 04:24 /bin/sh -> /bin/bash
+I: Setting pbuilder2's login shell to /bin/bash
+I: Setting pbuilder2's GECOS to second user,second room,second work-phone,second home-phone,second other
+I: user script /srv/workspace/pbuilder/188717/tmp/hooks/D01_modify_environment finished
+I: user script /srv/workspace/pbuilder/188717/tmp/hooks/D02_print_environment starting
 I: set
-    BUILDDIR='/build/reproducible-path'
-    BUILDUSERGECOS='first user,first room,first work-phone,first home-phone,first other'
-    BUILDUSERNAME='pbuilder1'
-    BUILD_ARCH='arm64'
-    DEBIAN_FRONTEND='noninteractive'
-    DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=12 '
-    DISTRIBUTION='forky'
-    HOME='/root'
-    HOST_ARCH='arm64'
+    BASH=/bin/sh
+    BASHOPTS=checkwinsize:cmdhist:complete_fullquote:extquote:force_fignore:globasciiranges:globskipdots:hostcomplete:interactive_comments:patsub_replacement:progcomp:promptvars:sourcepath
+    BASH_ALIASES=()
+    BASH_ARGC=()
+    BASH_ARGV=()
+    BASH_CMDS=()
+    BASH_LINENO=([0]="12" [1]="0")
+    BASH_LOADABLES_PATH=/usr/local/lib/bash:/usr/lib/bash:/opt/local/lib/bash:/usr/pkg/lib/bash:/opt/pkg/lib/bash:.
+    BASH_SOURCE=([0]="/tmp/hooks/D02_print_environment" [1]="/tmp/hooks/D02_print_environment")
+    BASH_VERSINFO=([0]="5" [1]="3" [2]="3" [3]="1" [4]="release" [5]="aarch64-unknown-linux-gnu")
+    BASH_VERSION='5.3.3(1)-release'
+    BUILDDIR=/build/reproducible-path
+    BUILDUSERGECOS='second user,second room,second work-phone,second home-phone,second other'
+    BUILDUSERNAME=pbuilder2
+    BUILD_ARCH=arm64
+    DEBIAN_FRONTEND=noninteractive
+    DEB_BUILD_OPTIONS='buildinfo=+all reproducible=+all parallel=12 nocheck'
+    DIRSTACK=()
+    DISTRIBUTION=forky
+    EUID=0
+    FUNCNAME=([0]="Echo" [1]="main")
+    GROUPS=()
+    HOME=/root
+    HOSTNAME=i-capture-the-hostname
+    HOSTTYPE=aarch64
+    HOST_ARCH=arm64
     IFS=' 
 '
-    INVOCATION_ID='7c7dec4a095148debb57cd30c0788beb'
-    LANG='C'
-    LANGUAGE='en_US:en'
-    LC_ALL='C'
-    MAIL='/var/mail/root'
-    OPTIND='1'
-    PATH='/usr/sbin:/usr/bin:/sbin:/bin:/usr/games'
-    PBCURRENTCOMMANDLINEOPERATION='build'
-    PBUILDER_OPERATION='build'
-    PBUILDER_PKGDATADIR='/usr/share/pbuilder'
-    PBUILDER_PKGLIBDIR='/usr/lib/pbuilder'
-    PBUILDER_SYSCONFDIR='/etc'
-    PPID='4132079'
-    PS1='# '
-    PS2='> '
+    INVOCATION_ID=f2d1fa8e896c4d6d9efa9d9bdbcf9f6f
+    LANG=C
+    LANGUAGE=nl_BE:nl
+    LC_ALL=C
+    MACHTYPE=aarch64-unknown-linux-gnu
+    MAIL=/var/mail/root
+    OPTERR=1
+    OPTIND=1
+    OSTYPE=linux-gnu
+    PATH=/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path
+    PBCURRENTCOMMANDLINEOPERATION=build
+    PBUILDER_OPERATION=build
+    PBUILDER_PKGDATADIR=/usr/share/pbuilder
+    PBUILDER_PKGLIBDIR=/usr/lib/pbuilder
+    PBUILDER_SYSCONFDIR=/etc
+    PIPESTATUS=([0]="0")
+    POSIXLY_CORRECT=y
+    PPID=188717
     PS4='+ '
-    PWD='/'
-    SHELL='/bin/bash'
-    SHLVL='2'
-    SUDO_COMMAND='/usr/bin/timeout -k 18.1h 18h /usr/bin/ionice -c 3 /usr/bin/nice /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/pbuilderrc_fN2b --distribution forky --hookdir /etc/pbuilder/first-build-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/forky-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b1 --logfile b1/build.log sparql-wrapper-python_2.0.0-2.dsc'
-    SUDO_GID='109'
-    SUDO_HOME='/var/lib/jenkins'
-    SUDO_UID='104'
-    SUDO_USER='jenkins'
-    TERM='unknown'
-    TZ='/usr/share/zoneinfo/Etc/GMT+12'
-    USER='root'
-    _='/usr/bin/systemd-run'
-    http_proxy='http://192.168.101.4:3128'
+    PWD=/
+    SHELL=/bin/bash
+    SHELLOPTS=braceexpand:errexit:hashall:interactive-comments:posix
+    SHLVL=3
+    SUDO_COMMAND='/usr/bin/timeout -k 24.1h 24h /usr/bin/ionice -c 3 /usr/bin/nice -n 11 /usr/bin/unshare --uts -- /usr/sbin/pbuilder --build --configfile /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/pbuilderrc_Wxxh --distribution forky --hookdir /etc/pbuilder/rebuild-hooks --debbuildopts -b --basetgz /var/cache/pbuilder/forky-reproducible-base.tgz --buildresult /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b2 --logfile b2/build.log sparql-wrapper-python_2.0.0-2.dsc'
+    SUDO_GID=109
+    SUDO_HOME=/var/lib/jenkins
+    SUDO_UID=104
+    SUDO_USER=jenkins
+    TERM=unknown
+    TZ=/usr/share/zoneinfo/Etc/GMT-14
+    UID=0
+    USER=root
+    _='I: set'
+    http_proxy=http://192.168.101.4:3128
 I: uname -a
-    Linux codethink04-arm64 6.12.48+deb13-cloud-arm64 #1 SMP Debian 6.12.48-1 (2025-09-20) aarch64 GNU/Linux
+    Linux i-capture-the-hostname 6.12.48+deb13-cloud-arm64 #1 SMP Debian 6.12.48-1 (2025-09-20) aarch64 GNU/Linux
 I: ls -l /bin
-    lrwxrwxrwx 1 root root 7 Aug 10 12:30 /bin -> usr/bin
-I: user script /srv/workspace/pbuilder/4132079/tmp/hooks/D02_print_environment finished
+    lrwxrwxrwx 1 root root 7 Aug 10  2025 /bin -> usr/bin
+I: user script /srv/workspace/pbuilder/188717/tmp/hooks/D02_print_environment finished
 -> Attempting to satisfy build-dependencies
 -> Creating pbuilder-satisfydepends-dummy package
 Package: pbuilder-satisfydepends-dummy
@@ -177,7 +209,7 @@
 Get: 61 http://deb.debian.org/debian forky/main arm64 python3-pyparsing all 3.1.3-1 [148 kB]
 Get: 62 http://deb.debian.org/debian forky/main arm64 python3-pytest all 8.4.2-1 [266 kB]
 Get: 63 http://deb.debian.org/debian forky/main arm64 python3-rdflib all 7.1.1-3 [472 kB]
-Fetched 20.5 MB in 0s (87.3 MB/s)
+Fetched 20.5 MB in 0s (114 MB/s)
 Preconfiguring packages ...
 Selecting previously unselected package libexpat1:arm64.
 (Reading database ... (Reading database ... 5% (Reading database ... 10% (Reading database ... 15% (Reading database ... 20% (Reading database ... 25% (Reading database ... 30% (Reading database ... 35% (Reading database ... 40% (Reading database ... 45% (Reading database ... 50% (Reading database ... 55% (Reading database ... 60% (Reading database ... 65% (Reading database ... 70% (Reading database ... 75% (Reading database ... 80% (Reading database ... 85% (Reading database ... 90% (Reading database ... 95% (Reading database ... 100% (Reading database ... 19971 files and directories currently installed.)
@@ -394,8 +426,8 @@
 Setting up tzdata (2025b-5) ...
 
 Current default time zone: 'Etc/UTC'
-Local time is now: Fri Oct 31 21:48:43 UTC 2025.
-Universal Time is now: Fri Oct 31 21:48:43 UTC 2025.
+Local time is now: Fri Dec 4 04:25:07 UTC 2026.
+Universal Time is now: Fri Dec 4 04:25:07 UTC 2026.
 
 Run 'dpkg-reconfigure tzdata' if you wish to change it.
 
 Setting up autotools-dev (20240727.1) ...
@@ -456,7 +488,11 @@
 Building tag database...
 -> Finished parsing the build-deps
 I: Building the package
-I: Running cd /build/reproducible-path/sparql-wrapper-python-2.0.0/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games" HOME="/nonexistent/first-build" dpkg-genchanges -S > ../sparql-wrapper-python_2.0.0-2_source.changes
+I: user script /srv/workspace/pbuilder/188717/tmp/hooks/A99_set_merged_usr starting
+Not re-configuring usrmerge for forky
+I: user script /srv/workspace/pbuilder/188717/tmp/hooks/A99_set_merged_usr finished
+hostname: Name or service not known
+I: Running cd /build/reproducible-path/sparql-wrapper-python-2.0.0/ && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-buildpackage -us -uc -b && env PATH="/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/i/capture/the/path" HOME="/nonexistent/second-build" dpkg-genchanges -S > ../sparql-wrapper-python_2.0.0-2_source.changes
 dpkg-buildpackage: info: source package sparql-wrapper-python
 dpkg-buildpackage: info: source version 2.0.0-2
 dpkg-buildpackage: info: source distribution unstable
@@ -533,168896 +569,7 @@
 copying SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
 copying SPARQLWrapper/py.typed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper
 I: pybuild pybuild:334: cp -r test /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build
- debian/rules override_dh_auto_test
-make[1]: Entering directory '/build/reproducible-path/sparql-wrapper-python-2.0.0'
-# tests need a remote server
-dh_auto_test || :
-I: pybuild base:311: cd /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build; python3.13 -m pytest test
-============================= test session starts ============================== -platform linux -- Python 3.13.9, pytest-8.4.2, pluggy-1.6.0 -rootdir: /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build -configfile: pyproject.toml -plugins: typeguard-4.4.4 -collected 1525 items - -test/test_agrovoc-allegrograph_on_hold.py sFxxsFFsFFxsFFxxsFFFFxxsFFFFxx [ 1%] -sFFFFxxsFFFFFFFFssFFFxxFFxFFxxFFF [ 4%] -test/test_allegrograph__v4_14_1__mmi.py ssFFFFFFssFFFFssFFFFFFssFFFFFFss [ 6%] -FFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFssFFFFFFFFssFFFFFF [ 10%] -FFFFFFFFFFFFFFFFFFFFFFF [ 12%] -test/test_blazegraph__wikidata.py ssFFFFFFssFFFFssFFFFFFssFFFFFFsFsFsFFF [ 14%] -sFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFsFsFFFsFFFFFFFsFFsFsFFFFFFFsFF [ 19%] -FFFsFFFFFFFsFFFFF [ 20%] -test/test_cli.py ..F...FFFFFFFFFFFFFFFFFFFFFF [ 22%] -test/test_fuseki2__v3_6_0__agrovoc.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFF [ 24%] -sFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFs [ 29%] -FFFFFFFFFFsFFsFFFFFFF [ 30%] -test/test_fuseki2__v3_8_0__stw.py FFFsFFsFFFFFFFFFFsFFsFFFFFFFsFFFFFsFsF [ 33%] -sFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFsFFFFFsFsFsFFFFFFFFFFsFFFFsFFsFFFF [ 38%] -FFFFFFsFFsFFFFFFF [ 39%] -test/test_graphdbEnterprise__v8_9_0__rs.py ssssFFsFsssFsFssssFFsFsssFsFs [ 41%] -FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFsFFsFsF [ 45%] -ssFFsFsFsFsFsFssFFsFsFsFsF [ 47%] -test/test_lov-fuseki_on_hold.py FFFFFFFFFFFFFFssssssssssssssFFFFFFFFFFFF [ 50%] -FFFFssssssssssssssssFFFFFFFFFFFFFFFFssssssssssssssssFsFFssFFFFFFFFFFFFFF [ 54%] -Fssssssssssssss [ 55%] -test/test_rdf4j__geosciml.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 58%] -FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFFssFFsFsFssFFsFsFsFsFs [ 63%] -FssFFsFsFsFsF [ 64%] -test/test_stardog__lindas.py ssssFFsFsssFsFssssFFsFsssFsFsFsFsFsFsFsFsFs [ 67%] -FsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFsFFssFFFsFsFssFFsFsFsFsFs [ 71%] 
-FssFFsFsFsFsF [ 72%] -test/test_store__v1_1_4.py FFFsFFsFsFxFxFxxxxxxxxxxxxxxsFsssFsFsFsFxFxFx [ 75%] -xssxxxxxxxxxxxxsFsssFsssFssxFxFxxssxxxxxxxxxxxxFFFFssFFFFsFFsFsFxFxFxxxx [ 80%] -xxxxxxxxxx [ 81%] -test/test_virtuoso__v7_20_3230__dbpedia.py FFFssFssFFFFFFsssssFsssssssss [ 82%] -FFFssFFFFFFFFFFsFssssFssssssFsssFFFssFFFFFFFFFFssssssssssssssssFFFFssFFF [ 87%] -FFFFssFFFFFFsssFFsssssssss [ 89%] -test/test_virtuoso__v8_03_3313__dbpedia.py FFFssFssFFFFFFsssssssssssssss [ 91%] -FFFssFFFFFFFFFFsssssssssssssssssFFFssFFFFFFFFFFssssssssssssssssFFFFFsFFF [ 96%] -FFFFssFFFFFFssssssssssssss [ 97%] -test/test_wrapper.py ....s..........................F... [100%] - -=================================== FAILURES =================================== -____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. 
- # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. 
Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSON(self): -> result = self.__generic(askQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:403: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
    If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:459:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML(self):
>       result = self.__generic(askQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:345:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:410:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinJSONLD ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD(self):
>       result = self.__generic(askQuery, JSONLD, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:451:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:469:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:354:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinN3(self): -> result = self.__generic(constructQuery, N3, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:513: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = 
func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML(self): -> result = self.__generic(constructQuery, RDFXML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:499: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow(self): -> result = self.__generic(constructQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:593: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testConstructByGETinXML __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML(self): -> result = self.__generic(constructQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:485: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:520: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:506: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:601: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:492: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:643: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:629:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:724:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:615:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinN3(self):
>       result = self.__generic(describeQuery, N3, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:650:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinRDFXML(self): -> result = self.__generic(describeQuery, RDFXML, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:636: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinUnknow(self): -> result = self.__generic(describeQuery, "bar", POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:732: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinXML(self): -> result = self.__generic(describeQuery, XML, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:622: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________________ SPARQLWrapperTests.testKeepAlive _______________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testKeepAlive(self): - sparql = SPARQLWrapper(endpoint) - sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") - sparql.setReturnFormat(JSON) - sparql.setMethod(GET) - sparql.setUseKeepAlive() - -> sparql.query() - -test/test_agrovoc-allegrograph_on_hold.py:757: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - 
^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_agrovoc-allegrograph_on_hold.py:742: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:748: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:745: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:769: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_agrovoc-allegrograph_on_hold.py:232: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_agrovoc-allegrograph_on_hold.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSON(self): -> result = self.__generic(selectQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:260: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - 
^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinTSV(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:246: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinUnknow(self): -> result = self.__generic(selectQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.fao.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinCSV(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:239: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSON(self): -> result = self.__generic(selectQuery, JSON, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:267: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = 
func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinTSV(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:253: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinUnknow(self): -> result = self.__generic(selectQuery, "bar", POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:329: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinXML(self): -> result = self.__generic(selectQuery, XML, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_agrovoc-allegrograph_on_hold.py:224: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_agrovoc-allegrograph_on_hold.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSON(self): -> result = self.__generic(askQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:572: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSONLD_Unexpected(self): -> result = self.__generic(askQuery, JSONLD, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:647: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:658: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSON_Conneg(self): -> result = self.__generic(askQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:579: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinN3_Unexpected(self): -> result = self.__generic(askQuery, N3, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:603: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinN3_Unexpected_Conneg(self): -> result = self.__generic(askQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:614: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host 
= '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow(self): -> result = self.__generic(askQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:689: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow_Conneg(self): -> result = self.__generic(askQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:698: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host 
= '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML(self): -> result = self.__generic(askQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:460: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML_Conneg(self): -> result = self.__generic(askQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:468: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:586:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '515', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:669:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '515', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:680:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:593:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '475', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinN3_Unexpected(self): -> result = self.__generic(askQuery, N3, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:625: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': '*/*', 'Connection': 'close', 'Content-Length': '475', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinN3_Unexpected_Conneg(self): -> result = self.__generic(askQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:636: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '478', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinUnknow(self): -> result = self.__generic(askQuery, "bar", POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:707: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '478', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinUnknow_Conneg(self): -> result = self.__generic(askQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:716: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '478', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinXML(self): -> result = self.__generic(askQuery, XML, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:476: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '478', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '444', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[do_open/http.client/create_connection frames identical to the failure above]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:484:
[__generic/urllib frames identical to the failure above]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/http.client/create_connection frames identical to the first failure]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)

test/test_allegrograph__v4_14_1__mmi.py:885:
[__generic/urllib frames identical to the first failure]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/http.client/create_connection frames identical to the first failure]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:894:
[__generic/urllib frames identical to the first failure]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/http.client/create_connection frames identical to the first failure]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, GET)

test/test_allegrograph__v4_14_1__mmi.py:921:
[__generic/urllib frames identical to the first failure]
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/http.client/create_connection frames identical to the first failure]

E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:930:
[__generic/urllib frames identical to the first failure]

self =
http_class =
req =
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinN3(self): -> result = self.__generic(constructQuery, N3, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:823: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - 
^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinN3_Conneg(self): -> result = self.__generic(constructQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:830: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: 
in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML(self): -> result = self.__generic(constructQuery, RDFXML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:760: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML_Conneg(self): -> result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:768: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow(self): -> result = self.__generic(constructQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:956: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow_Conneg(self): -> result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:964: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testConstructByGETinXML __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML(self): -> result = self.__generic(constructQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:731: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers 
= {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML_Conneg(self): -> result = self.__generic(constructQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:738: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:903:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9',
h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:912:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '665', 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected(self):
>       result = self.__generic(constructQuery, JSON, POST)

test/test_allegrograph__v4_14_1__mmi.py:939:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host =
'127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '665', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)

test/test_allegrograph__v4_14_1__mmi.py:948:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '659', 'Content-Type': 'application/x-www-form-urlencoded', ...}
        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>       sock.connect(sa)
E       ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)

test/test_allegrograph__v4_14_1__mmi.py:837:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '659', 'Content-Type': 'application/x-www-form-urlencoded', ...}

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinN3_Conneg(self): -> result = self.__generic(constructQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:844: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + 
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '768', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinRDFXML(self): -> result = self.__generic(constructQuery, RDFXML, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:776: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '768', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinRDFXML_Conneg(self): -> result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:784: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args 
= {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinUnknow(self): -> result = self.__generic(constructQuery, "bar", POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:972: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinUnknow_Conneg(self):
->       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:980:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E           urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinXML(self):
->       result = self.__generic(constructQuery, XML, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:745:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '662', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E           urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinXML_Conneg(self):
->       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:752:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '628', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E           urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByGETinCSV_Unexpected(self):
->       result = self.__generic(describeQuery, CSV, GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:1148:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E           urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByGETinCSV_Unexpected_Conneg(self):
->       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:1158:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSON_Unexpected(self): -> result = self.__generic(describeQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:1185: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
=
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1194:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1086:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1093:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1023:
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:1031: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow(self): -> result = self.__generic(describeQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:1220: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow_Conneg(self): -> result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:1228: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML(self): -> result = self.__generic(describeQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:994: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1001:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1167:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1176:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '468', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1203:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '468', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->                   sock.connect(sa)
-E                   ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
->       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:1212:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-    ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-    ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-    ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-    ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-__________________ SPARQLWrapperTests.testDescribeByPOSTinN3 ___________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '462', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.
If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->                   sock.connect(sa)
-E                   ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinN3(self):
->       result = self.__generic(describeQuery, N3, POST)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:1100:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-    ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-    ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-    ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-    ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-
    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '462', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.
If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->                   sock.connect(sa)
-E                   ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinN3_Conneg(self):
->       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:1107:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-    ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-    ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-    ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-    ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '571', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->                   sock.connect(sa)
-E                   ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinRDFXML(self):
->       result = self.__generic(describeQuery, RDFXML, POST)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:1039:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-    ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-    ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-    ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-    ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '571', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->                   sock.connect(sa)
-E                   ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinRDFXML_Conneg(self):
->       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:1047:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-    ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-    ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-    ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-    ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args =
{'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1236:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1244:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1008:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '465', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:1015:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '431', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()

>       sparql.query()

test/test_allegrograph__v4_14_1__mmi.py:1269:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________________ SPARQLWrapperTests.testQueryBadFormed _____________________ - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryBadFormed(self): -> self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) - -test/test_allegrograph__v4_14_1__mmi.py:1254: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using 
http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. 
- - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryDuplicatedPrefix(self): -> result = self.__generic(queryDuplicatedPrefix, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:1260: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryManyPrefixes(self): -> result = self.__generic(queryManyPrefixes, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:1257: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryWithComma_3(self): -> result = self.__generic(queryWithCommaInUri, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:1281: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:245:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:252:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:301:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:376:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:387: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args 
= {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testSelectByGETinJSON_Conneg(self):
-> result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:308:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testSelectByGETinN3_Unexpected(self):
-> result = self.__generic(selectQuery, N3, GET)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:332:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testSelectByGETinN3_Unexpected_Conneg(self):
-> result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:343:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testSelectByGETinTSV(self):
-> result = self.__generic(selectQueryCSV_TSV, TSV, GET)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:273:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinTSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:280: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinUnknow(self): -> result = self.__generic(selectQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:418: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinUnknow_Conneg(self): -> result = self.__generic(selectQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:427: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': 
} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinXML(self): -> result = self.__generic(selectQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:213: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                      source_address=None, *, all_errors=False):
-    """Connect to *address* and return the socket object.
-
-    Convenience function. Connect to *address* (a 2-tuple ``(host,
-    port)``) and return the socket object. Passing the optional
-    *timeout* parameter will set the timeout on the socket instance
-    before attempting to connect. If no *timeout* is supplied, the
-    global default timeout setting returned by :func:`getdefaulttimeout`
-    is used. If *source_address* is set it must be a tuple of (host, port)
-    for the socket to bind as a source address before making the connection.
-    A host of '' or port 0 tells the OS to use the default. When a connection
-    cannot be created, raises the last error if *all_errors* is False,
-    and an ExceptionGroup of all errors if *all_errors* is True.
-    """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E          ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinXML_Conneg(self):
->       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:221:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'mmisw.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '664', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                      source_address=None, *, all_errors=False):
-    """Connect to *address* and return the socket object.
-
-    Convenience function. Connect to *address* (a 2-tuple ``(host,
-    port)``) and return the socket object. Passing the optional
-    *timeout* parameter will set the timeout on the socket instance
-    before attempting to connect. If no *timeout* is supplied, the
-    global default timeout setting returned by :func:`getdefaulttimeout`
-    is used. If *source_address* is set it must be a tuple of (host, port)
-    for the socket to bind as a source address before making the connection.
-    A host of '' or port 0 tells the OS to use the default. When a connection
-    cannot be created, raises the last error if *all_errors* is False,
-    and an ExceptionGroup of all errors if *all_errors* is True.
-    """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E          ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByPOSTinCSV(self):
->       result = self.__generic(selectQueryCSV_TSV, CSV, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:259:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '664', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                      source_address=None, *, all_errors=False):
-    """Connect to *address* and return the socket object.
-
-    Convenience function. Connect to *address* (a 2-tuple ``(host,
-    port)``) and return the socket object. Passing the optional
-    *timeout* parameter will set the timeout on the socket instance
-    before attempting to connect. If no *timeout* is supplied, the
-    global default timeout setting returned by :func:`getdefaulttimeout`
-    is used. If *source_address* is set it must be a tuple of (host, port)
-    for the socket to bind as a source address before making the connection.
-    A host of '' or port 0 tells the OS to use the default. When a connection
-    cannot be created, raises the last error if *all_errors* is False,
-    and an ExceptionGroup of all errors if *all_errors* is True.
-    """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E          ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByPOSTinCSV_Conneg(self):
->       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:266:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '487', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                      source_address=None, *, all_errors=False):
-    """Connect to *address* and return the socket object.
-
-    Convenience function. Connect to *address* (a 2-tuple ``(host,
-    port)``) and return the socket object. Passing the optional
-    *timeout* parameter will set the timeout on the socket instance
-    before attempting to connect. If no *timeout* is supplied, the
-    global default timeout setting returned by :func:`getdefaulttimeout`
-    is used. If *source_address* is set it must be a tuple of (host, port)
-    for the socket to bind as a source address before making the connection.
-    A host of '' or port 0 tells the OS to use the default. When a connection
-    cannot be created, raises the last error if *all_errors* is False,
-    and an ExceptionGroup of all errors if *all_errors* is True.
-    """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E          ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByPOSTinJSON(self):
->       result = self.__generic(selectQuery, JSON, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:315:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '487', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                      source_address=None, *, all_errors=False):
-    """Connect to *address* and return the socket object.
-
-    Convenience function. Connect to *address* (a 2-tuple ``(host,
-    port)``) and return the socket object. Passing the optional
-    *timeout* parameter will set the timeout on the socket instance
-    before attempting to connect. If no *timeout* is supplied, the
-    global default timeout setting returned by :func:`getdefaulttimeout`
-    is used. If *source_address* is set it must be a tuple of (host, port)
-    for the socket to bind as a source address before making the connection.
-    A host of '' or port 0 tells the OS to use the default. When a connection
-    cannot be created, raises the last error if *all_errors* is False,
-    and an ExceptionGroup of all errors if *all_errors* is True.
-    """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E          ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByPOSTinJSONLD_Unexpected(self):
->       result = self.__generic(selectQuery, JSONLD, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_allegrograph__v4_14_1__mmi.py:398:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:409: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSON_Conneg(self): -> result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:322: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: 
in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinN3_Unexpected(self): -> result = self.__generic(selectQuery, N3, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:354: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '481', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinN3_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:365: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '764', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinTSV(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:287: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '764', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinTSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:294: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '630', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinUnknow(self): -> result = self.__generic(selectQuery, "bar", POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:436: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers 
= {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinUnknow_Conneg(self): -> result = self.__generic(selectQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:445: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinXML(self): -> result = self.__generic(selectQuery, XML, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_allegrograph__v4_14_1__mmi.py:229: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_allegrograph__v4_14_1__mmi.py:189: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_allegrograph__v4_14_1__mmi.py:237: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_allegrograph__v4_14_1__mmi.py:189: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '450', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.
    If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:580: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSONLD_Unexpected(self):
>       result = self.__generic(askQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:655: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default.
    When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:666: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect.
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSON_Conneg(self): -> result = self.__generic(askQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:587: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result 
= func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByGETinN3_Unexpected(self):
->       result = self.__generic(askQuery, N3, GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:611:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByGETinN3_Unexpected_Conneg(self):
->       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:622:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByGETinUnknow(self):
->       result = self.__generic(askQuery, "foo", GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:697:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByGETinUnknow_Conneg(self):
->       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:706:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML(self): -> result = self.__generic(askQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:484: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML_Conneg(self): -> result = self.__generic(askQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:492: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '423', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinJSON(self): -> result = self.__generic(askQuery, JSON, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:594: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '423', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '457', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinJSONLD_Unexpected(self): -> result = self.__generic(askQuery, JSONLD, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:677: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': '*/*', 'Connection': 'close', 'Content-Length': '457', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSONLD_Unexpected_Conneg>

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:688:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinJSON_Conneg>

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:601:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected ________________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '417', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected>

    def testAskByPOSTinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, POST)

test/test_blazegraph__wikidata.py:633:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '417', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinN3_Unexpected_Conneg>

    def testAskByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:644:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
/usr/lib/python3.13/http/client.py:1384: in _send_request
/usr/lib/python3.13/http/client.py:1333: in endheaders
/usr/lib/python3.13/http/client.py:1093: in _send_output
/usr/lib/python3.13/http/client.py:1037: in send
/usr/lib/python3.13/http/client.py:1472: in connect
/usr/lib/python3.13/http/client.py:1003: in connect
/usr/lib/python3.13/socket.py:864: in create_connection
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = <object object at 0x...>
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_blazegraph__wikidata.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow>

    def testAskByPOSTinUnknow(self):
>       result = self.__generic(askQuery, "bar", POST)

test/test_blazegraph__wikidata.py:715:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:189: in urlopen
/usr/lib/python3.13/urllib/request.py:489: in open
/usr/lib/python3.13/urllib/request.py:506: in _open
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
/usr/lib/python3.13/urllib/request.py:1367: in https_open
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <urllib.request.HTTPSHandler object at 0x...>
http_class = <class 'http.client.HTTPSConnection'>
req = <urllib.request.Request object at 0x...>
http_conn_args = {'context': <ssl.SSLContext object at 0x...>}
host = '127.0.0.1:9', h = <http.client.HTTPSConnection object at 0x...>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:724:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:500:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '420', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:508:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinCSV_Unexpected(self):
>       result = self.__generic(constructQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:915:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:924:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSONLD_Conneg(self): -> result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:887: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': 
} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSON_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:962: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinN3_Conneg(self): -> result = self.__generic(constructQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:849: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML(self): -> result = self.__generic(constructQuery, RDFXML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:768: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:776:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context':
}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:811:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context':
}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:990:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers =
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:998:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML(self): -> result = self.__generic(constructQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:739: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByGETinXML_Conneg(self):
->       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:746:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected ____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinCSV_Unexpected(self):
->       result = self.__generic(constructQuery, CSV, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:933:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
->       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:942:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinJSONLD_Conneg(self):
->       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:906:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:982:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:868:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinRDFXML ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '755', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:784:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '755', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:792:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:830:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinUnknow(self):
->       result = self.__generic(constructQuery, "bar", POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:1006:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinUnknow_Conneg(self):
->       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:1014:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinXML(self):
->       result = self.__generic(constructQuery, XML, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:753:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '649', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinXML_Conneg(self):
->       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:760:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '615', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1200:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1210:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1174:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1248:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1138:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_blazegraph__wikidata.py:1057:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1065:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_blazegraph__wikidata.py:1100:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_blazegraph__wikidata.py:1276:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow_Conneg(self): -> result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:1284: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML(self): -> result = self.__generic(describeQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:1028: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML_Conneg(self): -> result = self.__generic(describeQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:1035: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinCSV_Unexpected(self): -> result = self.__generic(describeQuery, CSV, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:1219: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers 
= {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
-        When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
->       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:1228:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args =
{'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
-        When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinJSONLD_Conneg(self):
->       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:1191:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context':
}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
-        When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinJSON_Unexpected_Conneg(self):
->       result = self.__generic(describeQuery, JSON, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:1268:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args =
{'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.
-        If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinN3_Conneg(self):
->       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:1157:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in
_call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML _________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
-        When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-    host, port = address
-    exceptions = []
-    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-        af, socktype, proto, canonname, sa = res
-        sock = None
-        try:
-            sock = socket(af, socktype, proto)
-            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                sock.settimeout(timeout)
-            if source_address:
-                sock.bind(source_address)
->           sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinRDFXML(self):
->       result = self.__generic(describeQuery, RDFXML, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:1073:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers =
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1081:

self =
http_class =
req =
http_conn_args = {'context':
}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout =
source_address = None
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1119:

self =
http_class =
req =
http_conn_args = {'context':
}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout =
source_address = None
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)

test/test_blazegraph__wikidata.py:1292:

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers =
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout =
source_address = None
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:1300:

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout =
source_address = None
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinXML(self): -> result = self.__generic(describeQuery, XML, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:1042: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1049: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()

>       sparql.query()

test/test_blazegraph__wikidata.py:1328: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_blazegraph__wikidata.py:1310: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:1313: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryWithComma_1(self): -> result = self.__generic(queryWithCommaInCurie_1, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:1332: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryWithComma_3(self): -> result = self.__generic(queryWithCommaInUri, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:1341: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinCSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:267: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSON(self): -> result = self.__generic(selectQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:325: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E       ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinJSONLD_Unexpected(self):
->       result = self.__generic(selectQuery, JSONLD, GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:400:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E       urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E       ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
->       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:411:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E       urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E       ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinJSON_Conneg(self):
->       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:332:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E       urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E       ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinN3_Unexpected(self):
->       result = self.__generic(selectQuery, N3, GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:356:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E       urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E       ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinN3_Unexpected_Conneg(self):
->       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_blazegraph__wikidata.py:367:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_blazegraph__wikidata.py:201: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-    ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinTSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:301: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinUnknow(self): -> result = self.__generic(selectQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:442: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinUnknow_Conneg(self): -> result = self.__generic(selectQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:451: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinXML(self): -> result = self.__generic(selectQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:225: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinXML_Conneg(self): -> result = self.__generic(selectQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:233: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'query.wikidata.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinCSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:284: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '442', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSON(self): -> result = self.__generic(selectQuery, JSON, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:339: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '442', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '476', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSONLD_Unexpected(self): -> result = self.__generic(selectQuery, JSONLD, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:422: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '476', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_blazegraph__wikidata.py:433: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_blazegraph__wikidata.py:201: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:346: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinN3_Unexpected(self):
>       result = self.__generic(selectQuery, N3, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:378: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:389: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:318: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '406', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_blazegraph__wikidata.py:460: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_blazegraph__wikidata.py:201: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
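Every SPARQLWrapper failure in this run bottoms out in the same event: the build environment points the SPARQL endpoints at 127.0.0.1:9, where nothing is listening, so the TCP handshake is refused with errno 111 (ECONNREFUSED on Linux) and urllib wraps that in a URLError. A minimal sketch (assuming only that a freshly released ephemeral port stays unused for a moment) reproduces the refusal seen above:

```python
# Sketch of why the tests fail: socket.create_connection() to a port with no
# listener raises ConnectionRefusedError, exactly as in the tracebacks above.
import errno
import socket

def probe_closed_port() -> int:
    """Return the errno raised when connecting to a port nobody listens on."""
    # Ask the kernel for a free port, then release it so nothing listens there.
    probe = socket.socket()
    probe.bind(("127.0.0.1", 0))
    port = probe.getsockname()[1]
    probe.close()
    try:
        # The same call the traceback shows failing inside create_connection().
        socket.create_connection(("127.0.0.1", port), timeout=1)
    except ConnectionRefusedError as exc:
        return exc.errno
    return 0

print(probe_closed_port())  # errno.ECONNREFUSED (111 on Linux)
```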
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:469:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinXML(self):
>       result = self.__generic(selectQuery, XML, POST)

test/test_blazegraph__wikidata.py:241:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '405', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)

test/test_blazegraph__wikidata.py:249:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperCLIParser_Test.testInvalidFormat _________________

    def testInvalidFormat(self):
        with self.assertRaises(SystemExit) as cm:
            parse_args(["-Q", testquery, "-F", "jjssoonn"])

        self.assertEqual(cm.exception.code, 2)
>       self.assertEqual(
            sys.stderr.getvalue().split("\n")[1],
            "rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld')",
        )
E       AssertionError: "rqw:[65 chars]from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld)" != "rqw:[65 chars]from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rd[28 chars]ld')"
E       - rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld)
E       + rqw: error: argument -F/--format: invalid choice: 'jjssoonn' (choose from 'json', 'xml', 'turtle', 'n3', 'rdf', 'rdf+xml', 'csv', 'tsv', 'json-ld')
E       ? + + + + + + + + + + + + + + + + + +

test/test_cli.py:79: AssertionError
______________________ SPARQLWrapperCLI_Test.testQueryRDF ______________________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}
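Unlike the connection failures, the testInvalidFormat assertion above fails purely on wording: the Python 3.13 used in this build prints argparse's "invalid choice" list without quotes, while the test still expects the older quoted form. A hedged sketch (the parser below mimics only the -F/--format option; the prog name "rqw" is taken from the log, not from SPARQLWrapper's actual CLI code) shows the version-stable parts of that behaviour, namely exit status 2 and the "invalid choice" marker:

```python
# Reproduce the argparse error path without pinning the exact message wording,
# which differs between CPython versions.
import argparse
import contextlib
import io

parser = argparse.ArgumentParser(prog="rqw")
parser.add_argument("-F", "--format",
                    choices=["json", "xml", "turtle", "n3", "rdf",
                             "rdf+xml", "csv", "tsv", "json-ld"])

stderr = io.StringIO()
exit_code = None
with contextlib.redirect_stderr(stderr):
    try:
        parser.parse_args(["-F", "jjssoonn"])  # invalid choice -> SystemExit
    except SystemExit as exc:
        exit_code = exc.code

print(exit_code)                               # 2
print("invalid choice" in stderr.getvalue())   # True
```

Asserting on the exit code and a substring rather than the full message is what keeps such a test working across interpreter versions.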
- """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryRDF(self): -> main(["-Q", "DESCRIBE ", "-e", endpoint, "-F", "rdf"]) - -test/test_cli.py:249: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/main.py:137: in main - results = sparql.query().convert() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 
'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperCLI_Test.testQueryTo4store ____________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryTo4store(self): -> main(["-e", "http://rdf.chise.org/sparql", "-Q", testquery]) - -test/test_cli.py:627: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/main.py:137: in main - results = sparql.query().convert() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open 
- return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperCLI_Test.testQueryToAgrovoc_AllegroGraph _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryToAgrovoc_AllegroGraph(self): -> main(["-e", "https://agrovoc.fao.org/sparql", "-Q", testquery]) - -test/test_cli.py:459: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/main.py:137: in main - results = sparql.query().convert() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperCLI_Test.testQueryToAllegroGraph _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryToAllegroGraph(self): -> main(["-e", "https://mmisw.org/sparql", "-Q", testquery]) - -test/test_cli.py:378: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/main.py:137: in main - results = sparql.query().convert() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in 
https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperCLI_Test.testQueryToBrazeGraph __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryToBrazeGraph(self): -> main(["-e", "https://query.wikidata.org/sparql", "-Q", testquery]) - -test/test_cli.py:546: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/main.py:137: in main - results = sparql.query().convert() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in 
https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_6 _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToFuseki2V3_6(self):
>       main(["-e", "https://agrovoc.uniroma2.it/sparql/", "-Q", testquery])

test/test_cli.py:573:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToFuseki2V3_8 _________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToFuseki2V3_8(self):
>       main(["-e", "http://zbw.eu/beta/sparql/stw/query", "-Q", testquery])

test/test_cli.py:600:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperCLI_Test.testQueryToGraphDBEnterprise ______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToGraphDBEnterprise(self):
>       main(["-e", "http://factforge.net/repositories/ff-news", "-Q", testquery])

test/test_cli.py:405:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperCLI_Test.testQueryToLovFuseki __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToLovFuseki(self):
>       main(["-e", "https://lov.linkeddata.es/dataset/lov/sparql/", "-Q", testquery])

test/test_cli.py:317:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperCLI_Test.testQueryToRDF4J ____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToRDF4J(self):
>       main(
            [
                "-e",
                "http://vocabs.ands.org.au/repository/api/sparql/csiro_international-chronostratigraphic-chart_2018-revised-corrected",
                "-Q",
                testquery,
            ]
        )

test/test_cli.py:344:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperCLI_Test.testQueryToStardog ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '102', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToStardog(self):
>       main(["-e", "https://lindas.admin.ch/query", "-Q", testquery, "-m", POST])

test/test_cli.py:432: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '102', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV7 __________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToVirtuosoV7(self):
>       main(["-e", "http://dbpedia.org/sparql", "-Q", testquery])

test/test_cli.py:516: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryToVirtuosoV8 __________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryToVirtuosoV8(self):
>       main(["-e", "http://dbpedia-live.openlinksw.com/sparql", "-Q", testquery])

test/test_cli.py:486: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperCLI_Test.testQueryWithEndpoint __________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithEndpoint(self):
>       main(
            [
                "-Q",
                testquery,
                "-e",
                endpoint,
            ]
        )

test/test_cli.py:97: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperCLI_Test.testQueryWithFile ____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryWithFile(self): -> main(["-f", testfile, "-e", endpoint]) - -test/test_cli.py:135: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/main.py:137: in main - results = sparql.query().convert() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return 
self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'c...lla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperCLI_Test.testQueryWithFileCSV __________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryWithFileCSV(self): -> main(["-f", testfile, "-e", endpoint, "-F", "csv"]) - -test/test_cli.py:291: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/main.py:137: in main - results = sparql.query().convert() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - 
return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperCLI_Test.testQueryWithFileN3 ___________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryWithFileN3(self): -> main(["-f", testfile, "-e", endpoint, "-F", "n3"]) - -test/test_cli.py:232: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/main.py:137: in main - results = sparql.query().convert() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return 
self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperCLI_Test.testQueryWithFileRDFXML _________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryWithFileRDFXML(self): -> main(["-f", testfile, "-e", endpoint, "-F", "rdf+xml"]) - -test/test_cli.py:270: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/main.py:137: in main - results = sparql.query().convert() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open 
- return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperCLI_Test.testQueryWithFileTSV __________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithFileTSV(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "tsv"])

test/test_cli.py:304:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperCLI_Test.testQueryWithFileTurtle _________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithFileTurtle(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "turtle"])

test/test_cli.py:188:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperCLI_Test.testQueryWithFileTurtleQuiet ______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithFileTurtleQuiet(self):
>       main(
            [
                "-f",
                testfile,
                "-e",
                endpoint,
                "-F",
                "turtle",
                "-q",
            ]
        )

test/test_cli.py:205:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperCLI_Test.testQueryWithFileXML __________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithFileXML(self):
>       main(["-f", testfile, "-e", endpoint, "-F", "xml"])

test/test_cli.py:167:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/main.py:137: in main
    results = sparql.query().convert()
              ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'ja.dbpedia.org', 'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/96.0.4664.110 Safari/537.36'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:489:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:496:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = 
'127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSON(self): -> result = self.__generic(askQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:545: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:629: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': 
} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSON_Conneg(self): -> result = self.__generic(askQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:552: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain 
- result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:587:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
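Every failure in this run follows the same pattern: the endpoint host resolves to 127.0.0.1:9, where nothing is listening, so the kernel refuses the TCP connection and urllib wraps the resulting ConnectionRefusedError in a URLError. A minimal sketch of that failure mode (it reserves and releases an ephemeral port to get a guaranteed-closed endpoint, rather than assuming port 9 is closed):

```python
import errno
import socket

# Grab an ephemeral port, then close the listener so nothing accepts on it,
# mimicking the dead 127.0.0.1:9 endpoint these tests run into.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

try:
    # socket.create_connection() is what http.client ultimately calls;
    # with no listener, the kernel answers the SYN with RST and connect() fails.
    socket.create_connection(("127.0.0.1", port), timeout=2)
except ConnectionRefusedError as exc:
    assert exc.errno == errno.ECONNREFUSED  # [Errno 111] on Linux
```

This is why each test reports ConnectionRefusedError first and urllib.error.URLError second: urllib's do_open catches the OSError from the socket layer and re-raises it as URLError, producing the chained traceback shown.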
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinTSV(self): -> result = self.__generic(askQuery, TSV, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:517: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinTSV_Conneg(self): -> result = self.__generic(askQuery, TSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:524: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow(self): -> result = self.__generic(askQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:658: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow_Conneg(self): -> result = self.__generic(askQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:667: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testAskByGETinXML(self):
-> result = self.__generic(askQuery, XML, GET)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:457:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testAskByGETinXML_Conneg(self):
-> result = self.__generic(askQuery, XML, GET, onlyConneg=True)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:465:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testAskByPOSTinCSV(self):
-> result = self.__generic(askQuery, CSV, POST)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:503:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testAskByPOSTinCSV_Conneg(self):
-> result = self.__generic(askQuery, CSV, POST, onlyConneg=True)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:510:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '339', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testAskByPOSTinJSON(self):
-> result = self.__generic(askQuery, JSON, POST)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:559:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '339', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
->       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)
-
-test/test_fuseki2__v3_6_0__agrovoc.py:650:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinJSON_Conneg(self):
->       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
-
-test/test_fuseki2__v3_6_0__agrovoc.py:566:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinN3_Unexpected_Conneg(self):
->       result = self.__generic(askQuery, N3, POST, onlyConneg=True)
-
-test/test_fuseki2__v3_6_0__agrovoc.py:608:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '436', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinTSV(self):
->       result = self.__generic(askQuery, TSV, POST)
-
-test/test_fuseki2__v3_6_0__agrovoc.py:531:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinTSV_Conneg(self):
->       result = self.__generic(askQuery, TSV, POST, onlyConneg=True)
-
-test/test_fuseki2__v3_6_0__agrovoc.py:538:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host =
'127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinUnknow(self): -> result = self.__generic(askQuery, "bar", POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:676: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinUnknow_Conneg(self): -> result = self.__generic(askQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:685: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinXML(self): -> result = self.__generic(askQuery, XML, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:473: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '336', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinXML_Conneg(self): -> result = self.__generic(askQuery, XML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:481: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '302', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:874: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSONLD(self): -> result = self.__generic(constructQuery, JSONLD, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:831: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = 
func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSONLD_Conneg(self): -> result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:839: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + 
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSON_Unexpected(self): -> result = self.__generic(constructQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:901: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSON_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:910: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
->           raise URLError(err)
-E           urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinN3_Conneg(self):
->       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
-
-test/test_fuseki2__v3_6_0__agrovoc.py:806
->           raise URLError(err)
-E           urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinRDFXML_Conneg(self):
->       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
-
-test/test_fuseki2__v3_6_0__agrovoc.py:738
->           raise URLError(err)
-E           urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinTURTLE_Conneg(self):
->       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
-
-test/test_fuseki2__v3_6_0__agrovoc.py:772
->           raise URLError(err)
-E           urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinUnknow(self):
->       result = self.__generic(constructQuery, "foo", GET)
-
-test/test_fuseki2__v3_6_0__agrovoc.py:935
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow_Conneg(self): -> result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:943: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-__________________ SPARQLWrapperTests.testConstructByGETinXML __________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML(self): -> result = self.__generic(constructQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:700: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML_Conneg(self): -> result = self.__generic(constructQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:707: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinCSV_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:893: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '702', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinJSONLD(self): -> result = self.__generic(constructQuery, JSONLD, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:847: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '702', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinJSONLD_Conneg(self): -> result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:855: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '527', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinJSON_Unexpected(self): -> result = self.__generic(constructQuery, JSON, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:918: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '527', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinJSON_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:927: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
    If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:823:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() listings identical to the failure above ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:755:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib frames and do_open() listing identical to the failure above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() listings identical to the first failure above ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:789:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib frames and do_open() listing identical to the first failure above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '524', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() listings identical to the first failure above ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow(self):
>       result = self.__generic(constructQuery, "bar", POST)

test/test_fuseki2__v3_6_0__agrovoc.py:951:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib frames and do_open() listing identical to the first failure above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() listings identical to the first failure above ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_fuseki2__v3_6_0__agrovoc.py:959:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)

[... urllib frames and do_open() listing identical to the first failure above ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '524', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... do_open() and create_connection() listings identical to the first failure above ...]

>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testConstructByPOSTinXML(self):
>       result = self.__generic(constructQuery, XML, POST)

test/test_fuseki2__v3_6_0__agrovoc.py:714:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers =
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '524', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinXML_Conneg(self): -> result = self.__generic(constructQuery, XML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:721: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '490', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1143: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSONLD(self): -> result = self.__generic(describeQuery, JSONLD, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1103: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = 
func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSONLD_Conneg(self): -> result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1110: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + 
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSON_Unexpected(self): -> result = self.__generic(describeQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1170: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1179: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinN3_Conneg(self): -> result = self.__generic(describeQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1079: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1011: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinTURTLE_Conneg(self): -> result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1045: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow(self): -> result = self.__generic(describeQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1204: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers 
= {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow_Conneg(self): -> result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1212: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML(self): -> result = self.__generic(describeQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:973: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML_Conneg(self): -> result = self.__generic(describeQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:980: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
->       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:1162: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '501', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByPOSTinJSONLD(self):
->       result = self.__generic(describeQuery, JSONLD, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:1117: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '501', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByPOSTinJSONLD_Conneg(self):
->       result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:1124: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '326', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByPOSTinJSON_Unexpected(self):
->       result = self.__generic(describeQuery, JSON, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:1187: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '326', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1196: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args 
= {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinN3_Conneg(self): -> result = self.__generic(describeQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1096: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1028: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinTURTLE_Conneg(self): -> result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:1062: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinUnknow(self):
>       result = self.__generic(describeQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1220: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1228: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinXML(self):
>       result = self.__generic(describeQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:987: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '323', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByPOSTinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:994: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '289', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()

>       sparql.query()

test/test_fuseki2__v3_6_0__agrovoc.py:1253: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_fuseki2__v3_6_0__agrovoc.py:1238: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1244: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1241: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1257: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:1266: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinCSV(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:246: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers 
= {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinCSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:253: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSON(self): -> result = self.__generic(selectQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:302: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - 
^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:386: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSON_Conneg(self): -> result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:309: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinN3_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:344: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinTSV(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:274: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers 
= {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinTSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:281: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:415:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'agrovoc.uniroma2.it', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

[... do_open()/create_connection() traceback identical to the previous failure
elided: ConnectionRefusedError: [Errno 111] Connection refused connecting to
127.0.0.1:9, re-raised as urllib.error.URLError at
/usr/lib/python3.13/urllib/request.py:1322 ...]

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:424:
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

[... identical traceback elided ...]

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:214:
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

[... identical traceback elided ...]

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:222:
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '466', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical traceback elided ...]

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:260:
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}

[... identical traceback elided: ConnectionRefusedError: [Errno 111] Connection
refused at /usr/lib/python3.13/socket.py:849 ...]

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:267:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args =
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '393', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSON(self): -> result = self.__generic(selectQuery, JSON, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:316: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - 
^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '393', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:407: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSON_Conneg(self): -> result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:323: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:365:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '566', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:288:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '566', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:295:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '432', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinUnknow(self):
>       result = self.__generic(selectQuery, "bar", POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_6_0__agrovoc.py:433:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinUnknow_Conneg(self): -> result = self.__generic(selectQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_6_0__agrovoc.py:442: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': 
} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByPOSTinXML(self):
->       result = self.__generic(selectQuery, XML, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:230: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '390', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByPOSTinXML_Conneg(self):
->       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_6_0__agrovoc.py:238: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_6_0__agrovoc.py:190: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '356', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testAskByGETinCSV(self):
->       result = self.__generic(askQuery, CSV, GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:493: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testAskByGETinCSV_Conneg(self):
->       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:500: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testAskByGETinJSON(self):
->       result = self.__generic(askQuery, JSON, GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:549: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:633: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class 
= -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSON_Conneg(self): -> result = self.__generic(askQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:556: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result 
= func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinN3_Unexpected_Conneg(self): -> result = self.__generic(askQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:591: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinTSV(self): -> result = self.__generic(askQuery, TSV, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:521: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinTSV_Conneg(self): -> result = self.__generic(askQuery, TSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:528: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow(self): -> result = self.__generic(askQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:662: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', 
h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow_Conneg(self): -> result = self.__generic(askQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:671: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML(self): -> result = self.__generic(askQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:461: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:469:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinCSV _____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}
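Every failure in this section follows the same two-stage pattern shown in full above: the endpoint resolves to 127.0.0.1:9, where nothing is listening during the build, so socket.create_connection() raises ConnectionRefusedError, and urllib's AbstractHTTPHandler.do_open() catches the OSError and re-raises it as URLError. A minimal sketch of that wrapping, assuming only that a freshly released localhost port has no listener (the probe/port names are illustrative, not from the build):

```python
import socket
import urllib.error
import urllib.request

# Pick a localhost port with no listener: bind an ephemeral port,
# record its number, then close the socket so connects are refused.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
port = probe.getsockname()[1]
probe.close()

# Socket level: the kernel refuses the connection (ECONNREFUSED).
try:
    socket.create_connection(("127.0.0.1", port), timeout=5)
except OSError as exc:
    low_level = type(exc).__name__
print("socket raised:", low_level)

# urllib level: do_open() re-raises the OSError as URLError, which is
# the second exception seen in each traceback in this section.
try:
    urllib.request.urlopen(f"http://127.0.0.1:{port}/", timeout=5)
except urllib.error.URLError as exc:
    wrapped = type(exc.reason).__name__
print("urllib raised: URLError wrapping", wrapped)
```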
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinCSV(self):
>       result = self.__generic(askQuery, CSV, POST)

test/test_fuseki2__v3_8_0__stw.py:507:
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinCSV_Conneg _________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:514:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinJSON ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '333', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSON(self):
>       result = self.__generic(askQuery, JSON, POST)

test/test_fuseki2__v3_8_0__stw.py:563:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:654:
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testAskByPOSTinJSON_Conneg(self):
->       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:570: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testAskByPOSTinN3_Unexpected_Conneg(self):
->       result = self.__generic(askQuery, N3, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:612: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________________ SPARQLWrapperTests.testAskByPOSTinTSV _____________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '430', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testAskByPOSTinTSV(self):
->       result = self.__generic(askQuery, TSV, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:535: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '430', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testAskByPOSTinTSV_Conneg _________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testAskByPOSTinTSV_Conneg(self):
->       result = self.__generic(askQuery, TSV, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:542: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testAskByPOSTinUnknow ___________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testAskByPOSTinUnknow(self):
->       result = self.__generic(askQuery, "bar", POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:680: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinUnknow_Conneg>

    def testAskByPOSTinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:689: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByPOSTinXML _____________________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinXML>

    def testAskByPOSTinXML(self):
>       result = self.__generic(askQuery, XML, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:477: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '330', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testAskByPOSTinXML_Conneg>

    def testAskByPOSTinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:485: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '296', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = <object object at 0x…>
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = <test.test_fuseki2__v3_8_0__stw.SPARQLWrapperTests testMethod=testConstructByGETinCSV_Unexpected_Conneg>

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:878: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

self = <urllib.request.HTTPHandler object at 0x…>
http_class = <class 'http.client.HTTPConnection'>
req = <urllib.request.Request object at 0x…>, http_conn_args = {}
host = '127.0.0.1:9', h = <http.client.HTTPConnection object at 0x…>
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinJSONLD(self):
->       result = self.__generic(constructQuery, JSONLD, GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:835: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinJSONLD_Conneg(self):
->       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:843: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinJSON_Unexpected(self):
->       result = self.__generic(constructQuery, JSON, GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:905: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinJSON_Unexpected_Conneg(self):
->       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:914: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinN3_Conneg(self):
->       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:810: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML_Conneg(self): -> result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:742: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinTURTLE_Conneg(self): -> result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:776: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow(self): -> result = self.__generic(constructQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:939: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host 
= '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow_Conneg(self): -> result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:947: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-__________________ SPARQLWrapperTests.testConstructByGETinXML __________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinXML(self):
->       result = self.__generic(constructQuery, XML, GET)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:704: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-    def testConstructByGETinXML_Conneg(self):
->       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
-
-test/test_fuseki2__v3_8_0__stw.py:711: 
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
->       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
-
-test/test_fuseki2__v3_8_0__stw.py:897: 
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testConstructByPOSTinJSONLD ________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '696', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-    def testConstructByPOSTinJSONLD(self):
->       result = self.__generic(constructQuery, JSONLD, POST)
-
-test/test_fuseki2__v3_8_0__stw.py:851: 
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-    def testConstructByPOSTinJSONLD_Conneg(self):
->       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)
-
-test/test_fuseki2__v3_8_0__stw.py:859: 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected ____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinJSON_Unexpected(self): -> result = self.__generic(constructQuery, JSON, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:922: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args 
= {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '521', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinJSON_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:931: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self 
= -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinN3_Conneg(self): -> result = self.__generic(constructQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:827: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinRDFXML_Conneg(self): -> result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:759: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByPOSTinTURTLE_Conneg(self):
->       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:793: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testConstructByPOSTinUnknow ________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByPOSTinUnknow(self):
->       result = self.__generic(constructQuery, "bar", POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:955: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByPOSTinUnknow_Conneg(self):
->       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:963: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testConstructByPOSTinXML __________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByPOSTinXML(self):
->       result = self.__generic(constructQuery, XML, POST)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:718: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '518', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByPOSTinXML_Conneg(self):
->       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:725: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '484', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1147: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSONLD(self): -> result = self.__generic(describeQuery, JSONLD, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1107: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host 
= '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSONLD_Conneg(self): -> result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1114: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSON_Unexpected(self): -> result = self.__generic(describeQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1174: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} 
-host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1183: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinN3_Conneg(self): -> result = self.__generic(describeQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1083: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1015: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinTURTLE_Conneg(self): -> result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1049: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_fuseki2__v3_8_0__stw.py:1208:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1216:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
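Every failure above has the same shape: `socket.create_connection()` gets ECONNREFUSED, and `urllib.request`'s `do_open()` re-raises that `OSError` as a `URLError`, keeping the original exception in the `.reason` attribute. A minimal sketch of that wrapping, assuming nothing listens on port 9 (the discard port) on the local host; the helper name is my own, not part of SPARQLWrapper or its test suite:

```python
import urllib.error
import urllib.request

def failure_reason(url, timeout=2.0):
    """Open *url* and report the class of the underlying socket error.

    Hypothetical helper, only to illustrate how urllib.request wraps the
    OSError raised by socket.create_connection() into a URLError whose
    .reason attribute carries the original exception.
    """
    # Bypass any http_proxy/https_proxy environment settings so the
    # connection really goes to the URL's own host and port.
    opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))
    try:
        opener.open(url, timeout=timeout)
        return "ok"
    except urllib.error.URLError as exc:
        return type(exc.reason).__name__

# On a typical Linux host port 9 is closed, so the connect() is refused.
print(failure_reason("http://127.0.0.1:9/"))
```

This is exactly the two-level traceback pytest prints: the inner `ConnectionRefusedError` from `socket.py:849`, then the outer `URLError` from `urllib/request.py:1322`.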
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_fuseki2__v3_8_0__stw.py:977:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:984:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, POST, onlyConneg=True)

test/test_fuseki2__v3_8_0__stw.py:1166:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err:  # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD _________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '495', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        ...
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
    ...
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        ...
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSONLD(self): -> result = self.__generic(describeQuery, JSONLD, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1121: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '495', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSONLD_Conneg(self): -> result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1128: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected ____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '320', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSON_Unexpected(self): -> result = self.__generic(describeQuery, JSON, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1191: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = 
{} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '320', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1200: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinN3_Conneg(self): -> result = self.__generic(describeQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1100: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1032: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinTURTLE_Conneg(self): -> result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1066: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testDescribeByPOSTinUnknow _________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinUnknow(self): -> result = self.__generic(describeQuery, "bar", POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1224: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinUnknow_Conneg(self): -> result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:1232: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testDescribeByPOSTinXML __________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByPOSTinXML(self):
->       result = self.__generic(describeQuery, XML, POST)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:991:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = 
'127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '317', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByPOSTinXML_Conneg(self):
->       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:998:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '283', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________________ SPARQLWrapperTests.testKeepAlive _______________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. 
If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testKeepAlive(self):
-        sparql = SPARQLWrapper(endpoint)
-        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
-        sparql.setReturnFormat(JSON)
-        sparql.setMethod(GET)
-        sparql.setUseKeepAlive()
-
->       sparql.query()
-
-test/test_fuseki2__v3_8_0__stw.py:1257:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________________ SPARQLWrapperTests.testQueryBadFormed _____________________
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testQueryBadFormed(self):
->       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)
-
-test/test_fuseki2__v3_8_0__stw.py:1242:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse 
object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E              urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testQueryDuplicatedPrefix(self):
->       result = self.__generic(queryDuplicatedPrefix, XML, GET)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:1248:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args 
= {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1245:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1261:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:1270:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:250:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
    ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
    ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
    ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
    ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinCSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:257: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                       if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                       if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:306:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                       if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                       if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:390:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                       if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                       if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:313:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                       if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                       if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:348:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:278:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:285:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:419:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:428:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinXML(self): -> result = self.__generic(selectQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:218: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:226:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'zbw.eu', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinCSV ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:264:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:271:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testSelectByPOSTinJSON ___________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '387', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSON(self): -> result = self.__generic(selectQuery, JSON, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_fuseki2__v3_8_0__stw.py:320: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_fuseki2__v3_8_0__stw.py:194: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '387', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:411:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:327:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:369:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByPOSTinTSV ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '537', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_fuseki2__v3_8_0__stw.py:292:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_fuseki2__v3_8_0__stw.py:194: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '537', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByPOSTinTSV_Conneg(self):
->       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:299: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '403', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testSelectByPOSTinUnknow __________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByPOSTinUnknow(self):
->       result = self.__generic(selectQuery, "bar", POST)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:437: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByPOSTinUnknow_Conneg(self):
->       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:446: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testSelectByPOSTinXML ___________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByPOSTinXML(self):
->       result = self.__generic(selectQuery, XML, POST)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:234: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '384', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByPOSTinXML_Conneg(self):
->       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
-        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_fuseki2__v3_8_0__stw.py:242: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-test/test_fuseki2__v3_8_0__stw.py:194: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '350', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:663: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - 
-self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSON_Conneg(self): -> result = self.__generic(askQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:585: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinN3_Unexpected_Conneg(self): -> result = self.__generic(askQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:621: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow_Conneg(self): -> result = self.__generic(askQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:702: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML_Conneg(self): -> result = self.__generic(askQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:488: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:684: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - 
-self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinJSON_Conneg(self): -> result = self.__generic(askQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:600: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in 
_call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinN3_Unexpected_Conneg(self): -> result = self.__generic(askQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:642: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinUnknow_Conneg(self): -> result = self.__generic(askQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:721: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ - -self = -http_class = <class 'http.client.HTTPConnection'> -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinXML_Conneg(self): -> result = self.__generic(askQuery, XML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:505: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class 
= -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '437', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ - -self = -http_class = <class 'http.client.HTTPConnection'> -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:898: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ - -self = -http_class = <class 'http.client.HTTPConnection'> -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSONLD_Conneg(self): -> result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:864: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ - -self = -http_class = <class 'http.client.HTTPConnection'> -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSON_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:936: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: <urlopen error [Errno 111] Connection refused> - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ - -self = -http_class = <class 'http.client.HTTPConnection'> -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:834: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:774: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:804: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:972: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:744: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
->       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:917:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinJSONLD_Conneg(self):
->       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:879:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
->       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:955:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testConstructByPOSTinN3_Conneg(self):
->       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:849:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinRDFXML_Conneg(self): -> result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:789: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinTURTLE_Conneg(self): -> result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:819: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinUnknow_Conneg(self): -> result = self.__generic(constructQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:989: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinXML_Conneg(self): -> result = self.__generic(constructQuery, XML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:759: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - 
-self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '573', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:1166: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSONLD_Conneg(self): -> result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:1132: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
- -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:1204: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinN3_Conneg(self): -> result = self.__generic(describeQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:1102: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + 
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:1042: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
- -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByGETinTURTLE_Conneg(self):
->       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:1072: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
[pytest re-displays the do_open frame here for the chained urllib.error.URLError leg, identical to the do_open listing above, with the same locals (host '127.0.0.1:9', Accept: 'application/turtle,text/turtle')]
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________
[traceback identical to testDescribeByGETinTURTLE_Conneg above, apart from the request: Accept: 'application/rdf+xml'; failing call at test/test_graphdbEnterprise__v8_9_0__rs.py:1240: result = self.__generic(describeQuery, "foo", GET, onlyConneg=True); same ConnectionRefusedError: [Errno 111] Connection refused connecting to 127.0.0.1:9, re-raised as urllib.error.URLError at /usr/lib/python3.13/urllib/request.py:1322]
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________
[traceback identical to testDescribeByGETinTURTLE_Conneg above, apart from the request: Accept: 'application/rdf+xml'; failing call at test/test_graphdbEnterprise__v8_9_0__rs.py:1012: result = self.__generic(describeQuery, XML, GET, onlyConneg=True); same ConnectionRefusedError: [Errno 111] Connection refused connecting to 127.0.0.1:9, re-raised as urllib.error.URLError at /usr/lib/python3.13/urllib/request.py:1322]
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________
[traceback identical to testDescribeByGETinTURTLE_Conneg above, apart from the request: Accept: '*/*', Content-Type: 'application/x-www-form-urlencoded', Content-Length: '424'; failing call at test/test_graphdbEnterprise__v8_9_0__rs.py:1185: result = self.__generic(describeQuery, CSV, POST, onlyConneg=True); same ConnectionRefusedError: [Errno 111] Connection refused connecting to 127.0.0.1:9, re-raised as urllib.error.URLError at /usr/lib/python3.13/urllib/request.py:1322]
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________
[traceback identical to testDescribeByGETinTURTLE_Conneg above, apart from the request: Accept: 'application/ld+json,application/x-json+ld', Content-Type: 'application/x-www-form-urlencoded', Content-Length: '424'; failing call at test/test_graphdbEnterprise__v8_9_0__rs.py:1147: result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True); same ConnectionRefusedError: [Errno 111] Connection refused connecting to 127.0.0.1:9, re-raised as urllib.error.URLError at /usr/lib/python3.13/urllib/request.py:1322]
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:1223: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinN3_Conneg(self): -> result = self.__generic(describeQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:1117: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + 
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:1057: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinTURTLE_Conneg(self): -> result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:1087: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinUnknow_Conneg(self):
->       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:1257:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinXML_Conneg(self):
->       result = self.__generic(describeQuery, XML, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:1027:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '424', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________________ SPARQLWrapperTests.testKeepAlive _______________________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testKeepAlive(self):
-        sparql = SPARQLWrapper(endpoint)
-        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
-        sparql.setReturnFormat(JSON)
-        sparql.setMethod(GET)
-        sparql.setUseKeepAlive()
-
->       sparql.query()
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:1286:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________________ SPARQLWrapperTests.testQueryBadFormed _____________________
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testQueryBadFormed(self):
->       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:1268:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:1271:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open/h.request/http.client/socket.create_connection frames identical to the previous failure ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1290:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
[... do_open listing identical to the previous failure ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open/h.request/http.client/socket.create_connection frames identical to the previous failure ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)

test/test_graphdbEnterprise__v8_9_0__rs.py:1298:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
[... do_open listing identical to the previous failure ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open/h.request/http.client/socket.create_connection frames identical to the previous failure ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:269:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
[... do_open listing identical to the previous failure ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open/h.request/http.client/socket.create_connection frames identical to the previous failure ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:407:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
[... do_open listing identical to the previous failure ...]
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
[... do_open/h.request/http.client/socket.create_connection frames identical to the previous failure ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_graphdbEnterprise__v8_9_0__rs.py:329:
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByGETinN3_Unexpected_Conneg(self):
->       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:365:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByGETinTSV_Conneg(self):
->       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:299:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByGETinUnknow_Conneg(self):
->       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:445:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByGETinXML_Conneg(self):
->       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_graphdbEnterprise__v8_9_0__rs.py:236:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'factforge.net', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinCSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:284: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:428: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
_ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSON_Conneg(self): -> result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:344: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + 
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinN3_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_graphdbEnterprise__v8_9_0__rs.py:386: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - 
-self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:314:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:464:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByPOSTinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_graphdbEnterprise__v8_9_0__rs.py:253:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_graphdbEnterprise__v8_9_0__rs.py:203: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '677', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:536:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinCSV_Conneg(self): -> result = self.__generic(askQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:543: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
- def testAskByGETinJSON(self):
-> result = self.__generic(askQuery, JSON, GET)
-
-test/test_lov-fuseki_on_hold.py:604:
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected ______________
-
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
- def testAskByGETinJSONLD_Unexpected(self):
-> result = self.__generic(askQuery, JSONLD, GET)
-
-test/test_lov-fuseki_on_hold.py:687:
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________
-
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
- def testAskByGETinJSONLD_Unexpected_Conneg(self):
-> result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
-
-test/test_lov-fuseki_on_hold.py:697:
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________
-
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
- def testAskByGETinJSON_Conneg(self):
-> result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
-
-test/test_lov-fuseki_on_hold.py:611:
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinN3_Unexpected ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinN3_Unexpected(self):
>       result = self.__generic(askQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:641:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:651:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:570:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:577:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinUnknow(self):
>       result = self.__generic(askQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:731:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow_Conneg(self): -> result = self.__generic(askQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:740: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML(self): -> result = self.__generic(askQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:498: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML_Conneg(self): -> result = self.__generic(askQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:506: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinCSV_Unexpected(self): -> result = self.__generic(constructQuery, CSV, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:967: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:976: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSONLD(self): -> result = self.__generic(constructQuery, JSONLD, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:928: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSONLD_Conneg(self): -> result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:936: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected ____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinJSON_Unexpected(self): -> result = self.__generic(constructQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:1010: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1019:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:890:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:898:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML(self):
>       result = self.__generic(constructQuery, RDFXML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:814:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:822:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinTURTLE(self):
>       result = self.__generic(constructQuery, TURTLE, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:852:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers =
{'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinTURTLE_Conneg(self): -> result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:860: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow(self): -> result = self.__generic(constructQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:1052: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow_Conneg(self): -> result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:1060: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testConstructByGETinXML __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:779: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:786: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected(self):
>       result = self.__generic(describeQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1280: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1289: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSONLD(self):
>       result = self.__generic(describeQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1244: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1251:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSON_Unexpected(self):
>       result = self.__generic(describeQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1323:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers =
{'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection cannot be created, raises the last error if
        *all_errors* is False, and an ExceptionGroup of all errors if
        *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
 
test/test_lov-fuseki_on_hold.py:1332:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3(self):
>       result = self.__generic(describeQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1207:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                    encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1215:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1131:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1139:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1169:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:1177:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow(self): -> result = self.__generic(describeQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:1365: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        ...
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    ...
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1373:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
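Every failure in this section is the same event wearing a different test name: the build environment apparently routes the suite's HTTPS requests to 127.0.0.1 port 9, where nothing is listening, so `socket.create_connection()` raises `ConnectionRefusedError` and urllib's `do_open()` wraps it in a `URLError`. A minimal, offline sketch of that wrapping (the dynamically chosen port stands in for the log's port 9, so nothing real is contacted):

```python
import socket
import urllib.error

# Reserve a loopback port by binding to port 0 (kernel picks a free one),
# then close the socket so nothing is listening on that port anymore.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
unused_port = probe.getsockname()[1]
probe.close()

try:
    # Same call that fails in the tracebacks above: a TCP connect to a
    # loopback port with no listener is refused immediately.
    socket.create_connection(("127.0.0.1", unused_port), timeout=2)
    wrapped = None
except OSError as err:
    # urllib's do_open() turns any OSError raised while sending the
    # request into urllib.error.URLError, which is what pytest reports.
    wrapped = urllib.error.URLError(err)

print(type(wrapped).__name__)                              # URLError
print(isinstance(wrapped.reason, ConnectionRefusedError))  # True
```

This is why each test shows two chained tracebacks: the inner `ConnectionRefusedError` from `socket.py` and the outer `URLError` from `urllib/request.py:1322`.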
__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    ...
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinXML(self):
>       result = self.__generic(describeQuery, XML, GET)

test/test_lov-fuseki_on_hold.py:1096:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    ...
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)

test/test_lov-fuseki_on_hold.py:1103:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________________ SPARQLWrapperTests.testKeepAlive _______________________

headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    ...
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()

>       sparql.query()

test/test_lov-fuseki_on_hold.py:1423:
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
    ...
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
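Note that even `testKeepAlive` sends `Connection: close`: the `do_open()` listing repeated through these tracebacks always overrides the header, because urllib's `addinfourl` reader cannot handle a persistent connection, and it also title-cases every header name. That normalization can be sketched on its own (the sample header dict below is illustrative, not taken from the log):

```python
def normalize_headers(headers):
    """Sketch of the header normalization inside urllib's do_open():
    force a non-persistent connection and title-case header names."""
    headers = dict(headers)
    # addinfourl would block reading a kept-alive socket, so do_open()
    # unconditionally overrides Connection with "close".
    headers["Connection"] = "close"
    return {name.title(): val for name, val in headers.items()}

normalized = normalize_headers({
    "accept": "application/sparql-results+xml",
    "user-agent": "sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)",
})
print(normalized["Connection"])  # close
print(list(normalized))          # ['Accept', 'User-Agent', 'Connection']
```

This matches the headers dicts shown in each failure, where `Connection` is always `'close'` regardless of what the test requested.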
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________
-
-[traceback identical to testKeepAlive above: ConnectionRefusedError: [Errno 111] Connection refused on connect to ('127.0.0.1', 9), re-raised by urllib as URLError; request headers identical except Accept: 'application/sparql-results+xml']
-
- def testQueryDuplicatedPrefix(self):
-> result = self.__generic(queryDuplicatedPrefix, XML, GET)
-
-test/test_lov-fuseki_on_hold.py:1414:
-test/test_lov-fuseki_on_hold.py:193: in __generic
- result = sparql.query()
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________
-
-[traceback identical to testKeepAlive above: ConnectionRefusedError: [Errno 111] Connection refused on connect to ('127.0.0.1', 9), re-raised by urllib as URLError; request headers identical except Accept: 'application/sparql-results+xml']
-
- def testQueryManyPrefixes(self):
-> result = self.__generic(queryManyPrefixes, XML, GET)
-
-test/test_lov-fuseki_on_hold.py:1411:
-test/test_lov-fuseki_on_hold.py:193: in __generic
- result = sparql.query()
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________
-
-[traceback identical to testKeepAlive above: ConnectionRefusedError: [Errno 111] Connection refused on connect to ('127.0.0.1', 9), re-raised by urllib as URLError; request headers identical except Accept: 'application/sparql-results+xml']
-
- def testQueryWithComma_3(self):
-> result = self.__generic(queryWithCommaInUri, XML, GET)
-
-test/test_lov-fuseki_on_hold.py:1443:
-test/test_lov-fuseki_on_hold.py:193: in __generic
- result = sparql.query()
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________
-
-[traceback identical to testKeepAlive above: ConnectionRefusedError: [Errno 111] Connection refused on connect to ('127.0.0.1', 9), re-raised by urllib as URLError; request headers identical except Accept: 'text/csv']
-
- def testSelectByGETinCSV(self):
-> result = self.__generic(selectQueryCSV_TSV, CSV, GET)
-
-test/test_lov-fuseki_on_hold.py:255:
-test/test_lov-fuseki_on_hold.py:193: in __generic
- result = sparql.query()
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinCSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:262: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host 
= '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:323:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:406:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:416:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:330:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinN3_Unexpected(self): -> result = self.__generic(selectQuery, N3, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:360: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinN3_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:370: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinTSV(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:289: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinTSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_lov-fuseki_on_hold.py:296: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_lov-fuseki_on_hold.py:193: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host 
= '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow(self):
>       result = self.__generic(selectQuery, "foo", GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:450:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:459:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:217:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_lov-fuseki_on_hold.py:225:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_lov-fuseki_on_hold.py:193: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lov.linkeddata.es', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:674:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinJSON_Conneg(self): -> result = self.__generic(askQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:591: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = 
func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinN3_Unexpected_Conneg(self): -> result = self.__generic(askQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:628: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow_Conneg(self): -> result = self.__generic(askQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:716: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML_Conneg(self): -> result = self.__generic(askQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:494: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
->       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:697:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinJSON_Conneg(self):
->       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:606:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinN3_Unexpected_Conneg(self):
->       result = self.__generic(askQuery, N3, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:651:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinUnknow_Conneg(self):
->       result = self.__generic(askQuery, "bar", POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:735:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByPOSTinXML_Conneg(self):
->       result = self.__generic(askQuery, XML, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:511:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '238', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:912: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError 
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ - -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def testConstructByGETinJSONLD_Conneg(self): -> result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True) - -test/test_rdf4j__geosciml.py:878: -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError 
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________ - -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def testConstructByGETinJSON_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, JSON, GET, onlyConneg=True) - -test/test_rdf4j__geosciml.py:950: -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError 
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________ - -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def testConstructByGETinN3_Conneg(self): -> result = self.__generic(constructQuery, N3, GET, onlyConneg=True) - -test/test_rdf4j__geosciml.py:848: -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError 
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML_Conneg(self): -> result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:788: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinTURTLE_Conneg(self): -> result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:818: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow_Conneg(self): -> result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:986: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML_Conneg(self): -> result = self.__generic(constructQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:758: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:931:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:893:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:969:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:863:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:803:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:833:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class
= 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:1003:

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_rdf4j__geosciml.py:773:

self = 
http_class = 
req =
, http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '418', 'Content-Type': 'application/x-www-form-urlencoded', ...}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1180:

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>           h.request(req.get_method(), req.selector, req.data, headers,
                      encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:

address = ('127.0.0.1', 9), timeout = 
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinJSONLD_Conneg(self):
>       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)

test/test_rdf4j__geosciml.py:1146:

self = 
http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1218: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class 
= -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinN3_Conneg(self): -> result = self.__generic(describeQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1116: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result 
= func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1056: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinTURTLE_Conneg(self): -> result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1086: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow_Conneg(self): -> result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1254: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req 
= , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML_Conneg(self): -> result = self.__generic(describeQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1025: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinCSV_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1199: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class 
= -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSONLD_Conneg(self): -> result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1161: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1237: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinN3_Conneg(self):
->       result = self.__generic(describeQuery, N3, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:1131:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinRDFXML_Conneg(self):
->       result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:1071:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinTURTLE_Conneg(self):
->       result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:1101:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testDescribeByPOSTinUnknow_Conneg(self):
->       result = self.__generic(describeQuery, "bar", POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:1271:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinXML_Conneg(self): -> result = self.__generic(describeQuery, XML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1041: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '225', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________________ SPARQLWrapperTests.testKeepAlive _______________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testKeepAlive(self): - sparql = SPARQLWrapper(endpoint) - sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") - sparql.setReturnFormat(JSON) - sparql.setMethod(GET) - sparql.setUseKeepAlive() - -> sparql.query() - -test/test_rdf4j__geosciml.py:1305: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryBadFormed_1 ____________________ - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryBadFormed_1(self): -> self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed_1, XML, GET) - -test/test_rdf4j__geosciml.py:1282: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object 
for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryManyPrefixes(self): -> result = self.__generic(queryManyPrefixes, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:1289: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1309: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:1317: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:266: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:409: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_rdf4j__geosciml.py:326: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_rdf4j__geosciml.py:200: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinN3_Unexpected_Conneg(self):
->       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:363:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinTSV_Conneg(self):
->       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:296:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinUnknow_Conneg(self):
->       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:451:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByGETinXML_Conneg(self):
->       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:233:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'vocabs.ands.org.au', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByPOSTinCSV_Conneg(self):
->       result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:281:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________
-
-self =
-http_class = <class 'http.client.HTTPConnection'>
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:432: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = 
-http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________
-
-self =
-http_class = <class 'http.client.HTTPConnection'>
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSON_Conneg(self): -> result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:341: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - 
result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________
-
-self =
-http_class = <class 'http.client.HTTPConnection'>
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinN3_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:386: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________
-
-self =
-http_class = <class 'http.client.HTTPConnection'>
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinTSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_rdf4j__geosciml.py:311: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_rdf4j__geosciml.py:200: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = 
-req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________
-
-self =
-http_class = <class 'http.client.HTTPConnection'>
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
-        When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByPOSTinUnknow_Conneg(self):
->       result = self.__generic(selectQuery, "bar", POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:470:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = ,
http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________
-
-self =
-http_class =
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
-        When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testSelectByPOSTinXML_Conneg(self):
->       result = self.__generic(selectQuery, XML, POST, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_rdf4j__geosciml.py:250:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_rdf4j__geosciml.py:200: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req = ,
http_conn_args = {}
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '376', 'Content-Type': 'application/x-www-form-urlencoded', ...}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
-        When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByGETinJSONLD_Unexpected_Conneg(self):
->       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_stardog__lindas.py:678:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_stardog__lindas.py:204: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host =
'127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.
If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByGETinJSON_Conneg(self):
->       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_stardog__lindas.py:595:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_stardog__lindas.py:204: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result =
func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: <urlopen error [Errno 111] Connection refused>
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.
-        When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E              ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
-    def testAskByGETinN3_Unexpected_Conneg(self):
->       result = self.__generic(askQuery, N3, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_stardog__lindas.py:632:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_stardog__lindas.py:204: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinUnknow_Conneg(self):
>       result = self.__generic(askQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:720:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:498:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByPOSTinJSONLD_Unexpected_Conneg __________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
    When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:701:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
        If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:610:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testAskByPOSTinN3_Unexpected_Conneg ____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinN3_Unexpected_Conneg(self): -> result = self.__generic(askQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:655: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testAskByPOSTinUnknow_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinUnknow_Conneg(self): -> result = self.__generic(askQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:739: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByPOSTinXML_Conneg _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByPOSTinXML_Conneg(self): -> result = self.__generic(askQuery, XML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:515: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '162', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:916: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host 
= '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:882: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:954: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:852: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:792: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:822: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:990: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML_Conneg(self): -> result = self.__generic(constructQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:762: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinCSV_Unexpected_Conneg(self): -> result = self.__generic(constructQuery, CSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:935: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testConstructByPOSTinJSONLD_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByPOSTinJSONLD_Conneg(self): -> result = self.__generic(constructQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:897: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host 
= '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testConstructByPOSTinJSON_Unexpected_Conneg ________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByPOSTinJSON_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:973:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinN3_Conneg _______________

    def testConstructByPOSTinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, POST, onlyConneg=True)

test/test_stardog__lindas.py:867:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinRDFXML_Conneg _____________

    def testConstructByPOSTinRDFXML_Conneg(self):
>       result = self.__generic(constructQuery, RDFXML, POST, onlyConneg=True)

test/test_stardog__lindas.py:807:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinTURTLE_Conneg _____________

    def testConstructByPOSTinTURTLE_Conneg(self):
>       result = self.__generic(constructQuery, TURTLE, POST, onlyConneg=True)

test/test_stardog__lindas.py:837:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)

test/test_stardog__lindas.py:1007:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused
E   urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByPOSTinXML_Conneg ______________

    def testConstructByPOSTinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, POST, onlyConneg=True)

test/test_stardog__lindas.py:777:
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...}
E   ConnectionRefusedError: [Errno 111] Connection refused

self =
http_class =
req =
http_conn_args = {'context': }
host =
'127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '342', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1183: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSONLD_Conneg(self): -> result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1149: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1221: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host 
= '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
        If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinN3_Conneg(self):
>       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1119: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1059: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1089: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1257: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinXML_Conneg(self):
>       result = self.__generic(describeQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1029: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByPOSTinCSV_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinCSV_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, CSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1202: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host 
= '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinJSONLD_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSONLD_Conneg(self): -> result = self.__generic(describeQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1164: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testDescribeByPOSTinJSON_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinJSON_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1240: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByPOSTinN3_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinN3_Conneg(self): -> result = self.__generic(describeQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1134: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - 
result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinRDFXML_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1074: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinTURTLE_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinTURTLE_Conneg(self): -> result = self.__generic(describeQuery, TURTLE, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1104: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByPOSTinUnknow_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinUnknow_Conneg(self): -> result = self.__generic(describeQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1274: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testDescribeByPOSTinXML_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByPOSTinXML_Conneg(self): -> result = self.__generic(describeQuery, XML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:1044: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '149', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________________ SPARQLWrapperTests.testKeepAlive _______________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testKeepAlive(self): - sparql = SPARQLWrapper(endpoint) - sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") - sparql.setReturnFormat(JSON) - sparql.setMethod(GET) - sparql.setUseKeepAlive() - -> sparql.query() - -test/test_stardog__lindas.py:1307: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1298:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers =
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1293:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_1(self):
>       result = self.__generic(queryWithCommaInCurie_1, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1311:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers =
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept':
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_stardog__lindas.py:270:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host =
'127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source listing identical to the first failure above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... http.client connect frames identical to the first failure above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    [... create_connection() source listing identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_stardog__lindas.py:413:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... SPARQLWrapper/urllib.request frames identical to the first failure above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host
= '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source listing identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source listing identical to the first failure above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... http.client connect frames identical to the first failure above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    [... create_connection() source listing identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_stardog__lindas.py:330:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_stardog__lindas.py:204: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result
= func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source listing identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source listing identical to the first failure above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... http.client connect frames identical to the first failure above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    [... create_connection() source listing identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_stardog__lindas.py:367:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... SPARQLWrapper/urllib.request frames identical to the first failure above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host =
'127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source listing identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source listing identical to the first failure above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... http.client connect frames identical to the first failure above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    [... create_connection() source listing identical to the first failure above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)

test/test_stardog__lindas.py:300:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
    [... SPARQLWrapper/urllib.request frames identical to the first failure above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host =
'127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source listing identical to the first failure above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source listing identical to the first failure above ...]
-            del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByGETinUnknow_Conneg(self):
->       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_stardog__lindas.py:455: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_stardog__lindas.py:204: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-            del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
-            del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
-    super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function.  Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object.  Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect.  If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used.  If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default.  When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testSelectByGETinXML_Conneg(self):
->       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_stardog__lindas.py:237: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_stardog__lindas.py:204: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-                       ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = 
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'lindas.admin.ch', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinCSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinCSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:285: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:436: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinJSON_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSON_Conneg(self): -> result = self.__generic(selectQuery, JSON, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:345: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - 
result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________ SPARQLWrapperTests.testSelectByPOSTinN3_Unexpected_Conneg ___________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinN3_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, N3, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:390: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinTSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinTSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:315: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testSelectByPOSTinUnknow_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinUnknow_Conneg(self): -> result = self.__generic(selectQuery, "bar", POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:474: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByPOSTinXML_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinXML_Conneg(self): -> result = self.__generic(selectQuery, XML, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_stardog__lindas.py:254: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_stardog__lindas.py:204: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Content-Length': '386', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:520: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = , http_conn_args = {}
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

host = '127.0.0.1:9'
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:527: 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_store__v1_1_4.py:583: 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________ SPARQLWrapperTests.testAskByGETinJSONLD_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(askQuery, JSONLD, GET, onlyConneg=True)

test/test_store__v1_1_4.py:673: 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_store__v1_1_4.py:590: 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testAskByGETinN3_Unexpected_Conneg _____________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinN3_Unexpected_Conneg(self): -> result = self.__generic(askQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_store__v1_1_4.py:627: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_store__v1_1_4.py:188: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args 
= {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinTSV_Conneg(self): -> result = self.__generic(askQuery, TSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_store__v1_1_4.py:560: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_store__v1_1_4.py:188: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow_Conneg(self): -> result = self.__generic(askQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_store__v1_1_4.py:718: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_store__v1_1_4.py:188: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = 
{} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML_Conneg(self): -> result = self.__generic(askQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_store__v1_1_4.py:494: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_store__v1_1_4.py:188: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
-    self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
-    self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
-    self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
-    self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
-    self.connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
-    self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
-    raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout = 
-source_address = None
-
-    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
-                          source_address=None, *, all_errors=False):
-        """Connect to *address* and return the socket object.
-
-        Convenience function. Connect to *address* (a 2-tuple ``(host,
-        port)``) and return the socket object. Passing the optional
-        *timeout* parameter will set the timeout on the socket instance
-        before attempting to connect. If no *timeout* is supplied, the
-        global default timeout setting returned by :func:`getdefaulttimeout`
-        is used. If *source_address* is set it must be a tuple of (host, port)
-        for the socket to bind as a source address before making the connection.
-        A host of '' or port 0 tells the OS to use the default. When a connection
-        cannot be created, raises the last error if *all_errors* is False,
-        and an ExceptionGroup of all errors if *all_errors* is True.
-        """
-
-        host, port = address
-        exceptions = []
-        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
-            af, socktype, proto, canonname, sa = res
-            sock = None
-            try:
-                sock = socket(af, socktype, proto)
-                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
-                    sock.settimeout(timeout)
-                if source_address:
-                    sock.bind(source_address)
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testConstructByGETinCSV_Unexpected_Conneg(self):
->       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
-                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_store__v1_1_4.py:942: 
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_store__v1_1_4.py:188: in __generic
-    result = sparql.query()
-             ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-           ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-               ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
-    return opener.open(url, data, timeout)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
-    response = self._open(req, data)
-               ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
-    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
-    result = func(*args)
-             ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1348: in http_open
-    return self.do_open(http.client.HTTPConnection, req)
-           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-            except OSError as err: # timeout error
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________ SPARQLWrapperTests.testConstructByGETinJSON_Unexpected_Conneg _________
-
-headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-    def testConstructByGETinJSON_Unexpected_Conneg(self):
->       result = self.__generic(constructQuery, JSON, GET, onlyConneg=True)
-
-test/test_store__v1_1_4.py:981: 
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________
-
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-    def testConstructByGETinN3_Conneg(self):
->       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
-
-test/test_store__v1_1_4.py:872: 
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________
-
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-    def testConstructByGETinRDFXML_Conneg(self):
->       result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True)
-
-test/test_store__v1_1_4.py:797: 
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________
-
-headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
->               sock.connect(sa)
-E               ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-    def testConstructByGETinTURTLE_Conneg(self):
->       result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True)
-
-test/test_store__v1_1_4.py:834: 
->               raise URLError(err)
-E               urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________
-
-self = 
-http_class = 
-req = , http_conn_args = {}
-host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
-    def do_open(self, http_class, req, **http_conn_args):
-        """Return an HTTPResponse object for the request, using http_class.
-
-        http_class must implement the HTTPConnection API from http.client.
-        """
-        host = req.host
-        if not host:
-            raise URLError('no host given')
-
-        # will parse host:port
-        h = http_class(host, timeout=req.timeout, **http_conn_args)
-        h.set_debuglevel(self._debuglevel)
-
-        headers = dict(req.unredirected_hdrs)
-        headers.update({k: v for k, v in req.headers.items()
-                        if k not in headers})
-
-        # TODO(jhylton): Should this be redesigned to handle
-        # persistent connections?
-
-        # We want to make an HTTP/1.1 request, but the addinfourl
-        # class isn't prepared to deal with a persistent connection.
-        # It will try to read all remaining data from the socket,
-        # which will block while the server waits for the next request.
-        # So make sure the connection gets closed after the (only)
-        # request.
-        headers["Connection"] = "close"
-        headers = {name.title(): val for name, val in headers.items()}
-
-        if req._tunnel_host:
-            tunnel_headers = {}
-            proxy_auth_hdr = "Proxy-Authorization"
-            if proxy_auth_hdr in headers:
-                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
-                # Proxy-Authorization should not be sent to origin
-                # server.
/usr/lib/python3.13/urllib/request.py:1319:
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)

test/test_store__v1_1_4.py:1020:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:763:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1246:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinJSON_Unexpected_Conneg _________

    def testDescribeByGETinJSON_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, JSON, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1287:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_store__v1_1_4.py:1097:
/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
E   ConnectionRefusedError: [Errno 111] Connection refused
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
, http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow_Conneg(self): -> result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_store__v1_1_4.py:1326: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_store__v1_1_4.py:188: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML_Conneg(self): -> result = self.__generic(describeQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_store__v1_1_4.py:1062: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_store__v1_1_4.py:188: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________________ SPARQLWrapperTests.testKeepAlive _______________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testKeepAlive(self): - sparql = SPARQLWrapper(endpoint) - sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") - sparql.setReturnFormat(JSON) - sparql.setMethod(GET) - sparql.setUseKeepAlive() - -> sparql.query() - -test/test_store__v1_1_4.py:1371: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ 
-/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________________ SPARQLWrapperTests.testQueryBadFormed _____________________ - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryBadFormed(self):
>       self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET)

test/test_store__v1_1_4.py:1356:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryDuplicatedPrefix(self):
>       result = self.__generic(queryDuplicatedPrefix, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1362:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryManyPrefixes(self):
>       result = self.__generic(queryManyPrefixes, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1359:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:1387:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:247:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self =
http_class =
req = , http_conn_args = {}
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_store__v1_1_4.py:254:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1348: in http_open
    return self.do_open(http.client.HTTPConnection, req)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req = ,
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testSelectByGETinCSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_store__v1_1_4.py:310:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames identical to testSelectByGETinCSV_Conneg above]
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testSelectByGETinCSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)

test/test_store__v1_1_4.py:403:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames identical to testSelectByGETinCSV_Conneg above]
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testSelectByGETinCSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)

test/test_store__v1_1_4.py:317:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames identical to testSelectByGETinCSV_Conneg above]
        except OSError as err: # timeout error
>           raise URLError(err)
E           urllib.error.URLError: <urlopen error [Errno 111] Connection refused>

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________ SPARQLWrapperTests.testSelectByGETinN3_Unexpected_Conneg ___________

host = '127.0.0.1:9'
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

[do_open/create_connection frames identical to testSelectByGETinCSV_Conneg above]
E   ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

    def testSelectByGETinN3_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, N3, GET, onlyConneg=True)

test/test_store__v1_1_4.py:357:
test/test_store__v1_1_4.py:188: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
[urllib frames identical to testSelectByGETinCSV_Conneg above]
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinTSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_store__v1_1_4.py:287: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_store__v1_1_4.py:188: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinUnknow_Conneg(self): -> result = self.__generic(selectQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_store__v1_1_4.py:448: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_store__v1_1_4.py:188: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , 
http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ - -self = -http_class = -req = , http_conn_args = {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinXML_Conneg(self): -> result = self.__generic(selectQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_store__v1_1_4.py:221: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_store__v1_1_4.py:188: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1348: in http_open - return self.do_open(http.client.HTTPConnection, req) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = , http_conn_args 
= {} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'rdf.chise.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinCSV(self): -> result = self.__generic(askQuery, CSV, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:526: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:533:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:586:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:593:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:556:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:563:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow(self): -> result = self.__generic(askQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:728: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow_Conneg(self): -> result = self.__generic(askQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:737: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML(self): -> result = self.__generic(askQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:492: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML_Conneg(self): -> result = self.__generic(askQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:500: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByPOSTinJSON_Conneg _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '228', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testAskByPOSTinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:608:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Content-Length': '228', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:950:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:895:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:904:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:866:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinN3_Conneg(self): -> result = self.__generic(constructQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:873: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + 
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML(self): -> result = self.__generic(constructQuery, RDFXML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:802: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML_Conneg(self): -> result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:809: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinTURTLE(self): -> result = self.__generic(constructQuery, TURTLE, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:833: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinTURTLE_Conneg(self): -> result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:841: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow(self): -> result = self.__generic(constructQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1048: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinUnknow_Conneg(self): -> result = self.__generic(constructQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1056: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testConstructByGETinXML __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML(self): -> result = self.__generic(constructQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:772: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinXML_Conneg(self): -> result = self.__generic(constructQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:779: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________ SPARQLWrapperTests.testConstructByPOSTinCSV_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '408', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:977: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '408', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByPOSTinN3 __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.
If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinN3(self):
>       result = self.__generic(constructQuery, N3, POST)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:880: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = 
func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Content-Length': '439', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testConstructByPOSTinUnknow_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '408', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByPOSTinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "bar", POST, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1073: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Content-Length': '408', 'Content-Type': 'application/x-www-form-urlencoded', ...}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testDescribeByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(describeQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v7_20_3230__dbpedia.py:1266: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSONLD(self): -> result = self.__generic(describeQuery, JSONLD, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1211: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinJSONLD_Conneg(self): -> result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1220: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinN3(self): -> result = self.__generic(describeQuery, N3, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1181: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = 
func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinN3_Conneg(self): -> result = self.__generic(describeQuery, N3, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1188: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + 
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML(self):
>       result = self.__generic(describeQuery, RDFXML, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1117:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testDescribeByGETinRDFXML_Conneg(self):
>       result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1124:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
    result = sparql.query()
SPARQLWrapper/Wrapper.py:960: in query
SPARQLWrapper/Wrapper.py:926: in _query
/usr/lib/python3.13/urllib/request.py:1367: in https_open

>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE(self):
>       result = self.__generic(describeQuery, TURTLE, GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1148:

>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinTURTLE_Conneg(self):
>       result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1156:

>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow(self):
>       result = self.__generic(describeQuery, "foo", GET)

test/test_virtuoso__v7_20_3230__dbpedia.py:1364:

>           raise URLError(err)
E           urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________

host = '127.0.0.1:9'
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testDescribeByGETinUnknow_Conneg(self):
>       result = self.__generic(describeQuery, "foo", GET, onlyConneg=True)

test/test_virtuoso__v7_20_3230__dbpedia.py:1372:

self =
http_class =
req =
http_conn_args
= {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML(self): -> result = self.__generic(describeQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1087: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML_Conneg(self): -> result = self.__generic(describeQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1094: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________________ SPARQLWrapperTests.testKeepAlive _______________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testKeepAlive(self): - sparql = SPARQLWrapper(endpoint) - sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10") - sparql.setReturnFormat(JSON) - sparql.setMethod(GET) - sparql.setUseKeepAlive() - -> sparql.query() - -test/test_virtuoso__v7_20_3230__dbpedia.py:1416: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - 
^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________________ SPARQLWrapperTests.testQueryBadFormed _____________________ - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryBadFormed(self): -> self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) - -test/test_virtuoso__v7_20_3230__dbpedia.py:1401: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, 
using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. 
- - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryDuplicatedPrefix(self): -> result = self.__generic(queryDuplicatedPrefix, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1407: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryManyPrefixes(self): -> result = self.__generic(queryManyPrefixes, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1404: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryWithComma_3(self): -> result = self.__generic(queryWithCommaInUri, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:1428: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinCSV(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:248: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinCSV_Conneg(self): -> result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:255: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args 
= {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSON(self): -> result = self.__generic(selectQuery, JSON, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:308: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) 
- ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSONLD_Unexpected(self): -> result = self.__generic(selectQuery, JSONLD, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:406: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:417: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
- headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. - del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. 
If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinJSON_Conneg(self): -> result = self.__generic(selectQuery, JSON, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:315: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + 
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testSelectByGETinTSV(self):
-> result = self.__generic(selectQueryCSV_TSV, TSV, GET)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_virtuoso__v7_20_3230__dbpedia.py:278:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testSelectByGETinTSV_Conneg(self):
-> result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_virtuoso__v7_20_3230__dbpedia.py:285:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testSelectByGETinUnknow(self):
-> result = self.__generic(selectQuery, "foo", GET)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_virtuoso__v7_20_3230__dbpedia.py:448:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
-> h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-/usr/lib/python3.13/http/client.py:1338: in request
- self._send_request(method, url, body, headers, encode_chunked)
-/usr/lib/python3.13/http/client.py:1384: in _send_request
- self.endheaders(body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1333: in endheaders
- self._send_output(message_body, encode_chunked=encode_chunked)
-/usr/lib/python3.13/http/client.py:1093: in _send_output
- self.send(msg)
-/usr/lib/python3.13/http/client.py:1037: in send
- self.connect()
-/usr/lib/python3.13/http/client.py:1472: in connect
- super().connect()
-/usr/lib/python3.13/http/client.py:1003: in connect
- self.sock = self._create_connection(
-/usr/lib/python3.13/socket.py:864: in create_connection
- raise exceptions[0]
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-address = ('127.0.0.1', 9), timeout =
-source_address = None
-
- def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
- source_address=None, *, all_errors=False):
- """Connect to *address* and return the socket object.
-
- Convenience function. Connect to *address* (a 2-tuple ``(host,
- port)``) and return the socket object. Passing the optional
- *timeout* parameter will set the timeout on the socket instance
- before attempting to connect. If no *timeout* is supplied, the
- global default timeout setting returned by :func:`getdefaulttimeout`
- is used. If *source_address* is set it must be a tuple of (host, port)
- for the socket to bind as a source address before making the connection.
- A host of '' or port 0 tells the OS to use the default. When a connection
- cannot be created, raises the last error if *all_errors* is False,
- and an ExceptionGroup of all errors if *all_errors* is True.
- """
-
- host, port = address
- exceptions = []
- for res in getaddrinfo(host, port, 0, SOCK_STREAM):
- af, socktype, proto, canonname, sa = res
- sock = None
- try:
- sock = socket(af, socktype, proto)
- if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
- sock.settimeout(timeout)
- if source_address:
- sock.bind(source_address)
-> sock.connect(sa)
-E ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self =
-
- def testSelectByGETinUnknow_Conneg(self):
-> result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-test/test_virtuoso__v7_20_3230__dbpedia.py:457:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic
- result = sparql.query()
- ^^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:960: in query
- return QueryResult(self._query())
- ^^^^^^^^^^^^^
-SPARQLWrapper/Wrapper.py:926: in _query
- response = urlopener(request)
- ^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:189: in urlopen
- return opener.open(url, data, timeout)
- ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:489: in open
- response = self._open(req, data)
- ^^^^^^^^^^^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:506: in _open
- result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain
- result = func(*args)
- ^^^^^^^^^^^
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
- return self.do_open(http.client.HTTPSConnection, req,
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr]
- h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
- try:
- try:
- h.request(req.get_method(), req.selector, req.data, headers,
- encode_chunked=req.has_header('Transfer-encoding'))
- except OSError as err: # timeout error
-> raise URLError(err)
-E urllib.error.URLError:
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________
-
-self =
-http_class =
-req =
-http_conn_args = {'context': }
-host = '127.0.0.1:9', h =
-headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
- def do_open(self, http_class, req, **http_conn_args):
- """Return an HTTPResponse object for the request, using http_class.
-
- http_class must implement the HTTPConnection API from http.client.
- """
- host = req.host
- if not host:
- raise URLError('no host given')
-
- # will parse host:port
- h = http_class(host, timeout=req.timeout, **http_conn_args)
- h.set_debuglevel(self._debuglevel)
-
- headers = dict(req.unredirected_hdrs)
- headers.update({k: v for k, v in req.headers.items()
- if k not in headers})
-
- # TODO(jhylton): Should this be redesigned to handle
- # persistent connections?
-
- # We want to make an HTTP/1.1 request, but the addinfourl
- # class isn't prepared to deal with a persistent connection.
- # It will try to read all remaining data from the socket,
- # which will block while the server waits for the next request.
- # So make sure the connection gets closed after the (only)
- # request.
- headers["Connection"] = "close"
- headers = {name.title(): val for name, val in headers.items()}
-
- if req._tunnel_host:
- tunnel_headers = {}
- proxy_auth_hdr = "Proxy-Authorization"
- if proxy_auth_hdr in headers:
- tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
- # Proxy-Authorization should not be sent to origin
- # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinXML(self): -> result = self.__generic(selectQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:214: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinXML_Conneg(self): -> result = self.__generic(selectQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:222: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': 
} -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -____________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected ____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '349', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSONLD_Unexpected(self): -> result = self.__generic(selectQuery, JSONLD, POST) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:428: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '349', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________ SPARQLWrapperTests.testSelectByPOSTinJSONLD_Unexpected_Conneg _________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '278', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByPOSTinJSONLD_Unexpected_Conneg(self): -> result = self.__generic(selectQuery, JSONLD, POST, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v7_20_3230__dbpedia.py:439: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v7_20_3230__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Content-Length': '278', 'Content-Type': 'application/x-www-form-urlencoded', ...} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinCSV _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV(self):
>       result = self.__generic(askQuery, CSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:528: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinCSV_Conneg __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinCSV_Conneg(self):
>       result = self.__generic(askQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:535: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testAskByGETinJSON _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON(self):
>       result = self.__generic(askQuery, JSON, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:588: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinJSON_Conneg _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinJSON_Conneg(self):
>       result = self.__generic(askQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:595: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________________ SPARQLWrapperTests.testAskByGETinTSV _____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV(self):
>       result = self.__generic(askQuery, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:558: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________ SPARQLWrapperTests.testAskByGETinTSV_Conneg __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinTSV_Conneg(self):
>       result = self.__generic(askQuery, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:565: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = 
'127.0.0.1:9', h = -headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testAskByGETinUnknow ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow(self): -> result = self.__generic(askQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:731: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = 
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testAskByGETinUnknow_Conneg ________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinUnknow_Conneg(self): -> result = self.__generic(askQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:740: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } 
-host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________________ SPARQLWrapperTests.testAskByGETinXML _____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testAskByGETinXML(self): -> result = self.__generic(askQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:494: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 
'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testAskByGETinXML_Conneg __________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testAskByGETinXML_Conneg(self):
>       result = self.__generic(askQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:502: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testConstructByGETinCSV_Unexpected_Conneg _________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinCSV_Unexpected_Conneg(self):
>       result = self.__generic(constructQuery, CSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:954: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinJSONLD _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD(self):
>       result = self.__generic(constructQuery, JSONLD, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:899: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinJSONLD_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinJSONLD_Conneg(self):
>       result = self.__generic(constructQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:908: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinN3 ___________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3(self):
>       result = self.__generic(constructQuery, N3, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:869: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testConstructByGETinN3_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinN3_Conneg(self):
>       result = self.__generic(constructQuery, N3, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:876: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
-/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinRDFXML _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML(self): -> result = self.__generic(constructQuery, RDFXML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:805: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinRDFXML_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinRDFXML_Conneg(self): -> result = self.__generic(constructQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:812: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -________________ SPARQLWrapperTests.testConstructByGETinTURTLE _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinTURTLE(self): -> result = self.__generic(constructQuery, TURTLE, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:836: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testConstructByGETinTURTLE_Conneg _____________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testConstructByGETinTURTLE_Conneg(self): -> result = self.__generic(constructQuery, TURTLE, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:844: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testConstructByGETinUnknow _________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow(self):
>       result = self.__generic(constructQuery, "foo", GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1053: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_____________ SPARQLWrapperTests.testConstructByGETinUnknow_Conneg _____________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinUnknow_Conneg(self):
>       result = self.__generic(constructQuery, "foo", GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1061: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testConstructByGETinXML __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML(self):
>       result = self.__generic(constructQuery, XML, GET)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:775: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testConstructByGETinXML_Conneg _______________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default. When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testConstructByGETinXML_Conneg(self):
>       result = self.__generic(constructQuery, XML, GET, onlyConneg=True)
        ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:782: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testDescribeByGETinCSV_Unexpected_Conneg __________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinCSV_Unexpected_Conneg(self): -> result = self.__generic(describeQuery, CSV, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1272: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E           urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_________________ SPARQLWrapperTests.testDescribeByGETinJSONLD _________________
-
-self = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
->               sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByGETinJSONLD(self):
->       result = self.__generic(describeQuery, JSONLD, GET)
-
-test/test_virtuoso__v8_03_3313__dbpedia.py:1217:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
-    result = sparql.query()
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E           urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_____________ SPARQLWrapperTests.testDescribeByGETinJSONLD_Conneg ______________
-
-self = 
-headers = {'Accept': 'application/ld+json,application/x-json+ld', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
->               sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByGETinJSONLD_Conneg(self):
->       result = self.__generic(describeQuery, JSONLD, GET, onlyConneg=True)
-
-test/test_virtuoso__v8_03_3313__dbpedia.py:1226:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
-    result = sparql.query()
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E           urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-___________________ SPARQLWrapperTests.testDescribeByGETinN3 ___________________
-
-self = 
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
->               sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByGETinN3(self):
->       result = self.__generic(describeQuery, N3, GET)
-
-test/test_virtuoso__v8_03_3313__dbpedia.py:1187:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
-    result = sparql.query()
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
-                del headers[proxy_auth_hdr]
-            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)
-
-        try:
-            try:
-                h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-            except OSError as err: # timeout error
->               raise URLError(err)
-E           urllib.error.URLError: 
-
-/usr/lib/python3.13/urllib/request.py:1322: URLError
-_______________ SPARQLWrapperTests.testDescribeByGETinN3_Conneg ________________
-
-self = 
-headers = {'Accept': 'application/turtle,text/turtle,text/rdf+n3,application/n-triples,application/n3,text/n3', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
-
->               h.request(req.get_method(), req.selector, req.data, headers,
-                          encode_chunked=req.has_header('Transfer-encoding'))
-
-/usr/lib/python3.13/urllib/request.py:1319:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
->               sock.connect(sa)
-E           ConnectionRefusedError: [Errno 111] Connection refused
-
-/usr/lib/python3.13/socket.py:849: ConnectionRefusedError
-
-During handling of the above exception, another exception occurred:
-
-self = 
-
-    def testDescribeByGETinN3_Conneg(self):
->       result = self.__generic(describeQuery, N3, GET, onlyConneg=True)
-
-test/test_virtuoso__v8_03_3313__dbpedia.py:1194:
-_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
-test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
-    result = sparql.query()
-SPARQLWrapper/Wrapper.py:960: in query
-    return QueryResult(self._query())
-SPARQLWrapper/Wrapper.py:926: in _query
-    response = urlopener(request)
-/usr/lib/python3.13/urllib/request.py:1367: in https_open
-    return self.do_open(http.client.HTTPSConnection, req,
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinRDFXML _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
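Every failure above bottoms out in `socket.create_connection` raising `ConnectionRefusedError: [Errno 111] Connection refused` against `('127.0.0.1', 9)`. A minimal sketch of that failure mode follows; it uses a bound-then-released ephemeral port instead of port 9 so the refusal is guaranteed locally (the variable names are illustrative, not from the log):

```python
import socket

# Reserve an ephemeral port, then release it, so we hold a local port
# that is known to be closed -- connecting to it is actively refused.
probe = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
probe.bind(("127.0.0.1", 0))
closed_port = probe.getsockname()[1]
probe.close()

try:
    # Mirrors the bottom frame of the traceback: sock.connect(sa) fails
    # with ECONNREFUSED because nothing is listening on the port.
    socket.create_connection(("127.0.0.1", closed_port), timeout=1)
    refused = False
except ConnectionRefusedError:
    refused = True
```

The tests fail the same way because nothing listens on 127.0.0.1:9 (the discard port) in the build environment, so every network-dependent test aborts at connect time.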
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinRDFXML(self): -> result = self.__generic(describeQuery, RDFXML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1123: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinRDFXML_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinRDFXML_Conneg(self): -> result = self.__generic(describeQuery, RDFXML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1130: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinTURTLE _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
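As the `do_open` listing above shows, urllib catches the `OSError` from `h.request(...)` and re-raises it as `urllib.error.URLError`, which is why each test reports two chained exceptions ("During handling of the above exception, another exception occurred"). A small sketch of that wrapping, again against a deliberately closed local port (illustrative setup, not taken from the build):

```python
import socket
import urllib.error
import urllib.request

# A local port that is certain to refuse connections: bind, note, release.
s = socket.socket()
s.bind(("127.0.0.1", 0))
port = s.getsockname()[1]
s.close()

try:
    urllib.request.urlopen("http://127.0.0.1:%d/" % port, timeout=2)
    reason = None
except urllib.error.URLError as err:
    # do_open() wraps the low-level OSError, so the original
    # ConnectionRefusedError survives as the .reason attribute.
    reason = err.reason
```

That `.reason` attribute is the `[Errno 111] Connection refused` error visible at the end of each test's traceback.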
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinTURTLE(self): -> result = self.__generic(describeQuery, TURTLE, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1154: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', 
h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinTURTLE_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinTURTLE_Conneg(self): -> result = self.__generic(describeQuery, TURTLE, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1162: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = 
-http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/turtle,text/turtle', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testDescribeByGETinUnknow _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
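Note the mismatch visible in every traceback: `host = '127.0.0.1:9'` while the headers carry `'Host': 'live.dbpedia.org'`. That is urllib's proxy handling: with a proxy configured, the TCP connection goes to the proxy address while the Host header still names the origin server. The sketch below reproduces the same dead-proxy setup; an ephemeral closed port stands in for the unreachable 127.0.0.1:9, and no real network or DNS lookup is needed for the connection attempt to fail as in the log:

```python
import socket
import urllib.error
import urllib.request

# Stand-in for the build's unreachable proxy: a local port we know is closed.
s = socket.socket()
s.bind(("127.0.0.1", 0))
dead_proxy = "http://127.0.0.1:%d" % s.getsockname()[1]
s.close()

# With a ProxyHandler installed, urllib connects to the proxy address,
# not to live.dbpedia.org -- the origin host only appears in the headers.
opener = urllib.request.build_opener(
    urllib.request.ProxyHandler({"http": dead_proxy})
)
try:
    opener.open("http://live.dbpedia.org/sparql", timeout=2)
    got_urlerror = False
except urllib.error.URLError:
    got_urlerror = True
```

This matches the build environment, where the test suite's HTTP(S) traffic is pointed at a proxy address that accepts no connections, so every endpoint test fails identically regardless of the requested format.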
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow(self): -> result = self.__generic(describeQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1370: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_____________ SPARQLWrapperTests.testDescribeByGETinUnknow_Conneg ______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinUnknow_Conneg(self): -> result = self.__generic(describeQuery, "foo", GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1378: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args 
= {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -__________________ SPARQLWrapperTests.testDescribeByGETinXML ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML(self): -> result = self.__generic(describeQuery, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1093: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
-headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________ SPARQLWrapperTests.testDescribeByGETinXML_Conneg _______________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testDescribeByGETinXML_Conneg(self): -> result = self.__generic(describeQuery, XML, GET, onlyConneg=True) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1100: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = 
{'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/rdf+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_______________________ SPARQLWrapperTests.testKeepAlive _______________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. 
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function.  Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object.  Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect.  If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used.  If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.  When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testKeepAlive(self):
        sparql = SPARQLWrapper(endpoint)
        sparql.setQuery("SELECT * WHERE {?s ?p ?o} LIMIT 10")
        sparql.setReturnFormat(JSON)
        sparql.setMethod(GET)
        sparql.setUseKeepAlive()

>       sparql.query()

test/test_virtuoso__v8_03_3313__dbpedia.py:1422:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
           ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________________ SPARQLWrapperTests.testQueryBadFormed _____________________

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryBadFormed(self): -> self.assertRaises(QueryBadFormed, self.__generic, queryBadFormed, XML, GET) - -test/test_virtuoso__v8_03_3313__dbpedia.py:1407: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, 
using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -_________________ SPARQLWrapperTests.testQueryDuplicatedPrefix _________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryDuplicatedPrefix(self): -> result = self.__generic(queryDuplicatedPrefix, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1413: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryManyPrefixes ___________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryManyPrefixes(self): -> result = self.__generic(queryManyPrefixes, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1410: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h 
= -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryWithComma_1 ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testQueryWithComma_1(self): -> result = self.__generic(queryWithCommaInCurie_1, XML, GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:1426: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = 
'127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: - h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - except OSError as err: # timeout error -> raise URLError(err) -E urllib.error.URLError: - -/usr/lib/python3.13/urllib/request.py:1322: URLError -___________________ SPARQLWrapperTests.testQueryWithComma_3 ____________________ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = -headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'} - - def do_open(self, http_class, req, **http_conn_args): - """Return an HTTPResponse object for the request, using http_class. - - http_class must implement the HTTPConnection API from http.client. - """ - host = req.host - if not host: - raise URLError('no host given') - - # will parse host:port - h = http_class(host, timeout=req.timeout, **http_conn_args) - h.set_debuglevel(self._debuglevel) - - headers = dict(req.unredirected_hdrs) - headers.update({k: v for k, v in req.headers.items() - if k not in headers}) - - # TODO(jhylton): Should this be redesigned to handle - # persistent connections? - - # We want to make an HTTP/1.1 request, but the addinfourl - # class isn't prepared to deal with a persistent connection. - # It will try to read all remaining data from the socket, - # which will block while the server waits for the next request. - # So make sure the connection gets closed after the (only) - # request. - headers["Connection"] = "close" - headers = {name.title(): val for name, val in headers.items()} - - if req._tunnel_host: - tunnel_headers = {} - proxy_auth_hdr = "Proxy-Authorization" - if proxy_auth_hdr in headers: - tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr] - # Proxy-Authorization should not be sent to origin - # server. 
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

address = ('127.0.0.1', 9), timeout = 
source_address = None

    def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                          source_address=None, *, all_errors=False):
        """Connect to *address* and return the socket object.

        Convenience function. Connect to *address* (a 2-tuple ``(host,
        port)``) and return the socket object. Passing the optional
        *timeout* parameter will set the timeout on the socket instance
        before attempting to connect. If no *timeout* is supplied, the
        global default timeout setting returned by :func:`getdefaulttimeout`
        is used. If *source_address* is set it must be a tuple of (host, port)
        for the socket to bind as a source address before making the connection.
        A host of '' or port 0 tells the OS to use the default.
        When a connection
        cannot be created, raises the last error if *all_errors* is False,
        and an ExceptionGroup of all errors if *all_errors* is True.
        """

        host, port = address
        exceptions = []
        for res in getaddrinfo(host, port, 0, SOCK_STREAM):
            af, socktype, proto, canonname, sa = res
            sock = None
            try:
                sock = socket(af, socktype, proto)
                if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                    sock.settimeout(timeout)
                if source_address:
                    sock.bind(source_address)
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testQueryWithComma_3(self):
>       result = self.__generic(queryWithCommaInUri, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:1433: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9',
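Every failure in this run bottoms out in `socket.create_connection` being refused on 127.0.0.1:9, a port with nothing listening in this environment. The refusal can be reproduced against any listener-less local port; the bind-then-close trick below is an assumption used here to find such a port, not something the test suite itself does:

```python
import socket

# Find a local port with no listener: bind to an ephemeral port,
# record its number, then close the socket again.
probe = socket.socket()
probe.bind(("127.0.0.1", 0))
unused_port = probe.getsockname()[1]
probe.close()

# As in the traceback above, create_connection tries each address from
# getaddrinfo and re-raises the last per-address error when none
# succeeds - here ECONNREFUSED from sock.connect().
try:
    socket.create_connection(("127.0.0.1", unused_port), timeout=1)
except ConnectionRefusedError as err:
    print(err.errno)  # 111 (ECONNREFUSED) on Linux
```

On Python 3.11+ the same call with `all_errors=True` would instead raise an `ExceptionGroup` collecting every per-address error, as the quoted docstring notes.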
h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinCSV ____________________

self = 
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:248: 

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinCSV_Conneg ________________

self = 
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/csv', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinCSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, CSV, GET, onlyConneg=True)

test/test_virtuoso__v8_03_3313__dbpedia.py:255: 

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinJSON ___________________

self = 
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSON(self):
>       result = self.__generic(selectQuery, JSON, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:308: 

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
____________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected _____________

self = 
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319: 

>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

    def testSelectByGETinJSONLD_Unexpected(self):
>       result = self.__generic(selectQuery, JSONLD, GET)

test/test_virtuoso__v8_03_3313__dbpedia.py:406: 

>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________ SPARQLWrapperTests.testSelectByGETinJSONLD_Unexpected_Conneg _________

self = 
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSONLD_Unexpected_Conneg(self):
>       result = self.__generic(selectQuery, JSONLD, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:417:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': '*/*', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
_______________ SPARQLWrapperTests.testSelectByGETinJSON_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinJSON_Conneg(self):
>       result = self.__generic(selectQuery, JSON, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:315:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+json,application/json,text/javascript,application/javascript', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinTSV ____________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:278:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinTSV_Conneg ________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
/usr/lib/python3.13/http/client.py:1338: in request
    self._send_request(method, url, body, headers, encode_chunked)
/usr/lib/python3.13/http/client.py:1384: in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1333: in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
/usr/lib/python3.13/http/client.py:1093: in _send_output
    self.send(msg)
/usr/lib/python3.13/http/client.py:1037: in send
    self.connect()
/usr/lib/python3.13/http/client.py:1472: in connect
    super().connect()
/usr/lib/python3.13/http/client.py:1003: in connect
    self.sock = self._create_connection(
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout = 
source_address = None

def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT,
                      source_address=None, *, all_errors=False):
    """Connect to *address* and return the socket object.

    Convenience function. Connect to *address* (a 2-tuple ``(host,
    port)``) and return the socket object. Passing the optional
    *timeout* parameter will set the timeout on the socket instance
    before attempting to connect. If no *timeout* is supplied, the
    global default timeout setting returned by :func:`getdefaulttimeout`
    is used. If *source_address* is set it must be a tuple of (host, port)
    for the socket to bind as a source address before making the connection.
    A host of '' or port 0 tells the OS to use the default. When a connection
    cannot be created, raises the last error if *all_errors* is False,
    and an ExceptionGroup of all errors if *all_errors* is True.
    """

    host, port = address
    exceptions = []
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
        af, socktype, proto, canonname, sa = res
        sock = None
        try:
            sock = socket(af, socktype, proto)
            if timeout is not _GLOBAL_DEFAULT_TIMEOUT:
                sock.settimeout(timeout)
            if source_address:
                sock.bind(source_address)
>           sock.connect(sa)
E           ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self = 

    def testSelectByGETinTSV_Conneg(self):
>       result = self.__generic(selectQueryCSV_TSV, TSV, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:285:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:960: in query
    return QueryResult(self._query())
                       ^^^^^^^^^^^^^
SPARQLWrapper/Wrapper.py:926: in _query
    response = urlopener(request)
               ^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:189: in urlopen
    return opener.open(url, data, timeout)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:489: in open
    response = self._open(req, data)
               ^^^^^^^^^^^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:506: in _open
    result = self._call_chain(self.handle_open, protocol, protocol +
/usr/lib/python3.13/urllib/request.py:466: in _call_chain
    result = func(*args)
             ^^^^^^^^^^^
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'text/tab-separated-values', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
                del headers[proxy_auth_hdr]
            h.set_tunnel(req._tunnel_host, headers=tunnel_headers)

        try:
            try:
                h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError: 

/usr/lib/python3.13/urllib/request.py:1322: URLError
__________________ SPARQLWrapperTests.testSelectByGETinUnknow __________________

self = 
http_class = 
req = 
http_conn_args = {'context': }
host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    def do_open(self, http_class, req, **http_conn_args):
        """Return an HTTPResponse object for the request, using http_class.

        http_class must implement the HTTPConnection API from http.client.
        """
        host = req.host
        if not host:
            raise URLError('no host given')

        # will parse host:port
        h = http_class(host, timeout=req.timeout, **http_conn_args)
        h.set_debuglevel(self._debuglevel)

        headers = dict(req.unredirected_hdrs)
        headers.update({k: v for k, v in req.headers.items()
                        if k not in headers})

        # TODO(jhylton): Should this be redesigned to handle
        # persistent connections?

        # We want to make an HTTP/1.1 request, but the addinfourl
        # class isn't prepared to deal with a persistent connection.
        # It will try to read all remaining data from the socket,
        # which will block while the server waits for the next request.
        # So make sure the connection gets closed after the (only)
        # request.
        headers["Connection"] = "close"
        headers = {name.title(): val for name, val in headers.items()}

        if req._tunnel_host:
            tunnel_headers = {}
            proxy_auth_hdr = "Proxy-Authorization"
            if proxy_auth_hdr in headers:
                tunnel_headers[proxy_auth_hdr] = headers[proxy_auth_hdr]
                # Proxy-Authorization should not be sent to origin
                # server.
- del headers[proxy_auth_hdr] - h.set_tunnel(req._tunnel_host, headers=tunnel_headers) - - try: - try: -> h.request(req.get_method(), req.selector, req.data, headers, - encode_chunked=req.has_header('Transfer-encoding')) - -/usr/lib/python3.13/urllib/request.py:1319: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -/usr/lib/python3.13/http/client.py:1338: in request - self._send_request(method, url, body, headers, encode_chunked) -/usr/lib/python3.13/http/client.py:1384: in _send_request - self.endheaders(body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1333: in endheaders - self._send_output(message_body, encode_chunked=encode_chunked) -/usr/lib/python3.13/http/client.py:1093: in _send_output - self.send(msg) -/usr/lib/python3.13/http/client.py:1037: in send - self.connect() -/usr/lib/python3.13/http/client.py:1472: in connect - super().connect() -/usr/lib/python3.13/http/client.py:1003: in connect - self.sock = self._create_connection( -/usr/lib/python3.13/socket.py:864: in create_connection - raise exceptions[0] -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -address = ('127.0.0.1', 9), timeout = -source_address = None - - def create_connection(address, timeout=_GLOBAL_DEFAULT_TIMEOUT, - source_address=None, *, all_errors=False): - """Connect to *address* and return the socket object. - - Convenience function. Connect to *address* (a 2-tuple ``(host, - port)``) and return the socket object. Passing the optional - *timeout* parameter will set the timeout on the socket instance - before attempting to connect. If no *timeout* is supplied, the - global default timeout setting returned by :func:`getdefaulttimeout` - is used. If *source_address* is set it must be a tuple of (host, port) - for the socket to bind as a source address before making the connection. - A host of '' or port 0 tells the OS to use the default. 
When a connection - cannot be created, raises the last error if *all_errors* is False, - and an ExceptionGroup of all errors if *all_errors* is True. - """ - - host, port = address - exceptions = [] - for res in getaddrinfo(host, port, 0, SOCK_STREAM): - af, socktype, proto, canonname, sa = res - sock = None - try: - sock = socket(af, socktype, proto) - if timeout is not _GLOBAL_DEFAULT_TIMEOUT: - sock.settimeout(timeout) - if source_address: - sock.bind(source_address) -> sock.connect(sa) -E ConnectionRefusedError: [Errno 111] Connection refused - -/usr/lib/python3.13/socket.py:849: ConnectionRefusedError - -During handling of the above exception, another exception occurred: - -self = - - def testSelectByGETinUnknow(self): -> result = self.__generic(selectQuery, "foo", GET) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ - -test/test_virtuoso__v8_03_3313__dbpedia.py:450: -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ -test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic - result = sparql.query() - ^^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:960: in query - return QueryResult(self._query()) - ^^^^^^^^^^^^^ -SPARQLWrapper/Wrapper.py:926: in _query - response = urlopener(request) - ^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:189: in urlopen - return opener.open(url, data, timeout) - ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:489: in open - response = self._open(req, data) - ^^^^^^^^^^^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:506: in _open - result = self._call_chain(self.handle_open, protocol, protocol + -/usr/lib/python3.13/urllib/request.py:466: in _call_chain - result = func(*args) - ^^^^^^^^^^^ -/usr/lib/python3.13/urllib/request.py:1367: in https_open - return self.do_open(http.client.HTTPSConnection, req, -_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ - -self = -http_class = -req = -http_conn_args = {'context': } -host = '127.0.0.1:9', h = 
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source lines identical to the first traceback above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
______________ SPARQLWrapperTests.testSelectByGETinUnknow_Conneg _______________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source lines identical to the first traceback above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client frames identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    [... create_connection() source lines identical to the first traceback above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinUnknow_Conneg(self):
>       result = self.__generic(selectQuery, "foo", GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:459:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
[... SPARQLWrapper and urllib frames identical to the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args =
{'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source lines identical to the first traceback above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
___________________ SPARQLWrapperTests.testSelectByGETinXML ____________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source lines identical to the first traceback above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client frames identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    [... create_connection() source lines identical to the first traceback above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML(self):
>       result = self.__generic(selectQuery, XML, GET)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^

test/test_virtuoso__v8_03_3313__dbpedia.py:214:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
[... SPARQLWrapper and urllib frames identical to the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers =
{'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source lines identical to the first traceback above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
________________ SPARQLWrapperTests.testSelectByGETinXML_Conneg ________________

self =
http_class =
req =
http_conn_args = {'context': }
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source lines identical to the first traceback above ...]
>               h.request(req.get_method(), req.selector, req.data, headers,
                          encode_chunked=req.has_header('Transfer-encoding'))

/usr/lib/python3.13/urllib/request.py:1319:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
[... http.client frames identical to the first traceback above ...]
/usr/lib/python3.13/socket.py:864: in create_connection
    raise exceptions[0]
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

address = ('127.0.0.1', 9), timeout =
source_address = None

    [... create_connection() source lines identical to the first traceback above ...]
>               sock.connect(sa)
E               ConnectionRefusedError: [Errno 111] Connection refused

/usr/lib/python3.13/socket.py:849: ConnectionRefusedError

During handling of the above exception, another exception occurred:

self =

    def testSelectByGETinXML_Conneg(self):
>       result = self.__generic(selectQuery, XML, GET, onlyConneg=True)
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
test/test_virtuoso__v8_03_3313__dbpedia.py:222:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
test/test_virtuoso__v8_03_3313__dbpedia.py:190: in __generic
    result = sparql.query()
             ^^^^^^^^^^^^^^
[... SPARQLWrapper and urllib frames identical to the first traceback above ...]
/usr/lib/python3.13/urllib/request.py:1367: in https_open
    return self.do_open(http.client.HTTPSConnection, req,
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self =
http_class =
req =
http_conn_args = {'context':
}
host = '127.0.0.1:9', h =
headers = {'Accept': 'application/sparql-results+xml', 'Connection': 'close', 'Host': 'live.dbpedia.org', 'User-Agent': 'sparqlwrapper 2.0.0 (rdflib.github.io/sparqlwrapper)'}

    [... do_open() source lines identical to the first traceback above ...]
            except OSError as err: # timeout error
>               raise URLError(err)
E               urllib.error.URLError:

/usr/lib/python3.13/urllib/request.py:1322: URLError
_________________________ QueryResult_Test.testConvert _________________________

self =

    def testConvert(self):
        class FakeResponse(object):
            def __init__(self, content_type):
                self.content_type = content_type

            def info(self):
                return {"content-type": self.content_type}

            def read(self, len):
                return ""

        def _mime_vs_type(mime, requested_type):
            """
            :param mime: mimetype/Content-Type of the response
            :param requested_type: requested mimetype (alias)
            :return: number of warnings produced by combo
            """
            with warnings.catch_warnings(record=True) as w:
                qr = QueryResult((FakeResponse(mime), requested_type))

                try:
                    qr.convert()
                except:
                    pass

                # if len(w) > 0: print(w[0].message) # FOR DEBUG
                # if len(w) > 1: print(w[1].message) # FOR DEBUG

                return len(w)

        # In the cases of "application/ld+json" and "application/rdf+xml", the
        # RDFLib raised a warning because the manually created QueryResult has no real
        # response value (implemented a fake read).
        # "WARNING:rdflib.term: does not look like a valid URI, trying to serialize this will break."
        self.assertEqual(0, _mime_vs_type("application/sparql-results+xml", XML))
        self.assertEqual(0, _mime_vs_type("application/sparql-results+json", JSON))
        self.assertEqual(0, _mime_vs_type("text/n3", N3))
        self.assertEqual(0, _mime_vs_type("text/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/turtle", TURTLE))
        self.assertEqual(0, _mime_vs_type("application/json", JSON))

>       self.assertEqual(0, _mime_vs_type("application/ld+json", JSONLD))
E       AssertionError: 0 != 1

test/test_wrapper.py:876: AssertionError
=============================== warnings summary ===============================
test/test_agrovoc-allegrograph_on_hold.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_agrovoc-allegrograph_on_hold.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_allegrograph__v4_14_1__mmi.py:166
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_allegrograph__v4_14_1__mmi.py:166: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_blazegraph__wikidata.py:175
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_blazegraph__wikidata.py:175: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_fuseki2__v3_6_0__agrovoc.py:167
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_fuseki2__v3_6_0__agrovoc.py:167: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .
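The QueryResult_Test.testConvert failure above hinges on counting warnings captured with warnings.catch_warnings(record=True), as its _mime_vs_type helper does. A minimal, self-contained sketch of that counting pattern (count_warnings is a hypothetical stand-in, not SPARQLWrapper code):

```python
import warnings

def count_warnings(fn):
    # Record every warning emitted while fn() runs, the same way
    # _mime_vs_type does, and return how many were captured.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")  # ensure no warning is suppressed
        fn()
        return len(caught)

assert count_warnings(lambda: None) == 0
assert count_warnings(lambda: warnings.warn("x", RuntimeWarning)) == 1
```

An unexpected extra warning (here, rdflib warning about the fake response during JSON-LD conversion) is exactly what turns the expected count of 0 into the observed 1.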
test/test_fuseki2__v3_8_0__stw.py:168
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_fuseki2__v3_8_0__stw.py:168: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_graphdbEnterprise__v8_9_0__rs.py:179
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_graphdbEnterprise__v8_9_0__rs.py:179: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_lov-fuseki_on_hold.py:170
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_lov-fuseki_on_hold.py:170: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_rdf4j__geosciml.py:176
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_rdf4j__geosciml.py:176: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_stardog__lindas.py:180
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_stardog__lindas.py:180: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .

test/test_store__v1_1_4.py:165
  /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_store__v1_1_4.py:165: SyntaxWarning: invalid escape sequence '\:'
    ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .
-
-test/test_virtuoso__v7_20_3230__dbpedia.py:167
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_virtuoso__v7_20_3230__dbpedia.py:167: SyntaxWarning: invalid escape sequence '\:'
- ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .
-
-test/test_virtuoso__v8_03_3313__dbpedia.py:167
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/test/test_virtuoso__v8_03_3313__dbpedia.py:167: SyntaxWarning: invalid escape sequence '\:'
- ?article ?relation dbpedia:Category\:Victoria\,\_British\_Columbia .
-
-test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
-test/test_blazegraph__wikidata.py: 4 warnings
-test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
-test/test_fuseki2__v3_8_0__stw.py: 2 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
-test/test_lov-fuseki_on_hold.py: 2 warnings
-test/test_rdf4j__geosciml.py: 2 warnings
-test/test_stardog__lindas.py: 2 warnings
-test/test_store__v1_1_4.py: 3 warnings
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'ASK' SPARQL query form
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
-test/test_blazegraph__wikidata.py: 4 warnings
-test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
-test/test_fuseki2__v3_8_0__stw.py: 2 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
-test/test_lov-fuseki_on_hold.py: 2 warnings
-test/test_rdf4j__geosciml.py: 2 warnings
-test/test_stardog__lindas.py: 2 warnings
-test/test_store__v1_1_4.py: 3 warnings
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'ASK' SPARQL query form
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
-test/test_blazegraph__wikidata.py: 8 warnings
-test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
-test/test_fuseki2__v3_8_0__stw.py: 8 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
-test/test_lov-fuseki_on_hold.py: 8 warnings
-test/test_rdf4j__geosciml.py: 4 warnings
-test/test_stardog__lindas.py: 4 warnings
-test/test_store__v1_1_4.py: 8 warnings
-test/test_virtuoso__v7_20_3230__dbpedia.py: 8 warnings
-test/test_virtuoso__v8_03_3313__dbpedia.py: 8 warnings
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'foo'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 4 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 8 warnings
-test/test_blazegraph__wikidata.py: 8 warnings
-test/test_fuseki2__v3_6_0__agrovoc.py: 8 warnings
-test/test_fuseki2__v3_8_0__stw.py: 8 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 4 warnings
-test/test_rdf4j__geosciml.py: 4 warnings
-test/test_stardog__lindas.py: 4 warnings
-test/test_store__v1_1_4.py: 8 warnings
-test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:348: SyntaxWarning: Ignore format 'bar'; current instance supports: json, xml, turtle, n3, rdf, rdf+xml, csv, tsv, json-ld.
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
-test/test_blazegraph__wikidata.py: 4 warnings
-test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
-test/test_fuseki2__v3_8_0__stw.py: 2 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
-test/test_lov-fuseki_on_hold.py: 2 warnings
-test/test_rdf4j__geosciml.py: 2 warnings
-test/test_stardog__lindas.py: 2 warnings
-test/test_store__v1_1_4.py: 3 warnings
-test/test_virtuoso__v7_20_3230__dbpedia.py: 2 warnings
-test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'CONSTRUCT' SPARQL query form
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
-test/test_blazegraph__wikidata.py: 2 warnings
-test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings
-test/test_fuseki2__v3_8_0__stw.py: 4 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
-test/test_lov-fuseki_on_hold.py: 2 warnings
-test/test_rdf4j__geosciml.py: 2 warnings
-test/test_stardog__lindas.py: 2 warnings
-test/test_store__v1_1_4.py: 3 warnings
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'CONSTRUCT' SPARQL query form
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
-test/test_blazegraph__wikidata.py: 4 warnings
-test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
-test/test_fuseki2__v3_8_0__stw.py: 2 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
-test/test_lov-fuseki_on_hold.py: 2 warnings
-test/test_rdf4j__geosciml.py: 2 warnings
-test/test_stardog__lindas.py: 2 warnings
-test/test_store__v1_1_4.py: 3 warnings
-test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
-test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'csv' in a 'DESCRIBE' SPARQL query form
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
-test/test_blazegraph__wikidata.py: 2 warnings
-test/test_fuseki2__v3_6_0__agrovoc.py: 4 warnings
-test/test_fuseki2__v3_8_0__stw.py: 4 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
-test/test_lov-fuseki_on_hold.py: 2 warnings
-test/test_rdf4j__geosciml.py: 2 warnings
-test/test_stardog__lindas.py: 2 warnings
-test/test_store__v1_1_4.py: 3 warnings
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json' in a 'DESCRIBE' SPARQL query form
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 1 warning
-test/test_allegrograph__v4_14_1__mmi.py: 1 warning
-test/test_blazegraph__wikidata.py: 1 warning
-test/test_fuseki2__v3_6_0__agrovoc.py: 1 warning
-test/test_fuseki2__v3_8_0__stw.py: 1 warning
-test/test_graphdbEnterprise__v8_9_0__rs.py: 1 warning
-test/test_lov-fuseki_on_hold.py: 1 warning
-test/test_rdf4j__geosciml.py: 1 warning
-test/test_stardog__lindas.py: 1 warning
-test/test_store__v1_1_4.py: 1 warning
-test/test_virtuoso__v7_20_3230__dbpedia.py: 1 warning
-test/test_virtuoso__v8_03_3313__dbpedia.py: 1 warning
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:663: UserWarning: keepalive support not available, so the execution of this method has no effect
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
-test/test_blazegraph__wikidata.py: 4 warnings
-test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
-test/test_fuseki2__v3_8_0__stw.py: 2 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
-test/test_lov-fuseki_on_hold.py: 2 warnings
-test/test_rdf4j__geosciml.py: 2 warnings
-test/test_stardog__lindas.py: 2 warnings
-test/test_store__v1_1_4.py: 3 warnings
-test/test_virtuoso__v7_20_3230__dbpedia.py: 4 warnings
-test/test_virtuoso__v8_03_3313__dbpedia.py: 2 warnings
-test/test_wrapper.py: 1 warning
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'json-ld' in a 'SELECT' SPARQL query form
- warnings.warn(
-
-test/test_agrovoc-allegrograph_on_hold.py: 2 warnings
-test/test_allegrograph__v4_14_1__mmi.py: 4 warnings
-test/test_blazegraph__wikidata.py: 4 warnings
-test/test_cli.py: 1 warning
-test/test_fuseki2__v3_6_0__agrovoc.py: 2 warnings
-test/test_fuseki2__v3_8_0__stw.py: 2 warnings
-test/test_graphdbEnterprise__v8_9_0__rs.py: 2 warnings
-test/test_lov-fuseki_on_hold.py: 2 warnings
-test/test_rdf4j__geosciml.py: 2 warnings
-test/test_stardog__lindas.py: 2 warnings
-test/test_store__v1_1_4.py: 3 warnings
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'n3' in a 'SELECT' SPARQL query form
- warnings.warn(
-
-test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:794: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'rdf' in a 'DESCRIBE' SPARQL query form
- warnings.warn(
-
-test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'rdf+xml' in a 'SELECT' SPARQL query form
- warnings.warn(
-
-test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtle
- /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/Wrapper.py:778: RuntimeWarning: Sending Accept header '*/*' because unexpected returned format 'turtle' in a 'SELECT' SPARQL query form
- warnings.warn(
-
--- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
-=========================== short test summary info ============================
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByGETinXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSON
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinJSONLD
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinUnknow
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testAskByPOSTinXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinN3
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testConstructByPOSTinXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinN3
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testDescribeByPOSTinXML
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testKeepAlive
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryBadFormed
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinCSV
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinJSON
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinTSV
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
-FAILED test/test_agrovoc-allegrograph_on_hold.py::SPARQLWrapperTests::testSelectByPOSTinXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testKeepAlive
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryBadFormed
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryDuplicatedPrefix
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryManyPrefixes
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testQueryWithComma_3
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML
-FAILED test/test_allegrograph__v4_14_1__mmi.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testKeepAlive
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryBadFormed
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryManyPrefixes
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_1
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testQueryWithComma_3
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML
-FAILED test/test_blazegraph__wikidata.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg
-FAILED test/test_cli.py::SPARQLWrapperCLIParser_Test::testInvalidFormat - Ass...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryRDF - urllib.error.U...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryTo4store - urllib.er...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAgrovoc_AllegroGraph
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToAllegroGraph - url...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToBrazeGraph - urlli...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_6 - urll...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToFuseki2V3_8 - urll...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToGraphDBEnterprise
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToLovFuseki - urllib...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToRDF4J - urllib.err...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToStardog - urllib.e...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV7 - urlli...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryToVirtuosoV8 - urlli...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithEndpoint - urlli...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFile - urllib.er...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileCSV - urllib...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileN3 - urllib....
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileRDFXML - url...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTSV - urllib...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtle - url...
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileTurtleQuiet
-FAILED test/test_cli.py::SPARQLWrapperCLI_Test::testQueryWithFileXML - urllib...
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByGETinXML_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg
-FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow
-FAILED
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg -FAILED 
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg -FAILED 
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testKeepAlive -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryBadFormed -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryDuplicatedPrefix -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryManyPrefixes -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_1 -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testQueryWithComma_3 -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg -FAILED 
test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML -FAILED test/test_fuseki2__v3_6_0__agrovoc.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV -FAILED 
test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByGETinXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinCSV_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinTSV_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML -FAILED 
test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg -FAILED 
test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected 
-FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testKeepAlive -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryBadFormed -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryDuplicatedPrefix -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryManyPrefixes -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_1 -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testQueryWithComma_3 -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg 
-FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML -FAILED test/test_fuseki2__v3_8_0__stw.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByGETinXML_Conneg -FAILED 
test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg -FAILED 
test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg -FAILED 
test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testKeepAlive -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryBadFormed -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryManyPrefixes -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_1 -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testQueryWithComma_3 -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg -FAILED test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg -FAILED 
test/test_graphdbEnterprise__v8_9_0__rs.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testAskByGETinXML_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3 -FAILED 
test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3 -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg -FAILED 
test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testKeepAlive - u... -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryDuplicatedPrefix -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryManyPrefixes -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testQueryWithComma_3 -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML -FAILED test/test_lov-fuseki_on_hold.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg -FAILED 
test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByGETinXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg -FAILED 
test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg -FAILED 
test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testKeepAlive - urll... -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryBadFormed_1 -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryManyPrefixes -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_1 -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testQueryWithComma_3 -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg -FAILED test/test_rdf4j__geosciml.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg -FAILED 
test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByGETinXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinN3_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinUnknow_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testAskByPOSTinXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSONLD_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinJSON_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinN3_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinRDFXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinTURTLE_Conneg -FAILED 
test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testConstructByPOSTinXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinCSV_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSONLD_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinJSON_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinN3_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinRDFXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinTURTLE_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinUnknow_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testDescribeByPOSTinXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testKeepAlive - urll... 
-FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryDuplicatedPrefix -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryManyPrefixes -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_1 -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testQueryWithComma_3 -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinCSV_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinJSON_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinN3_Unexpected_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinTSV_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinUnknow_Conneg -FAILED test/test_stardog__lindas.py::SPARQLWrapperTests::testSelectByPOSTinXML_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV - ur... -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON - u... 
-FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSONLD_Unexpected_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinN3_Unexpected_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testAskByGETinXML_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinJSON_Unexpected_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinJSON_Unexpected_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testKeepAlive - urllib... -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryBadFormed - u... 
-FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryDuplicatedPrefix -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryManyPrefixes -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testQueryWithComma_3 -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinN3_Unexpected_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg -FAILED test/test_store__v1_1_4.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg -FAILED 
test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testAskByPOSTinJSON_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3 -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinCSV_Unexpected_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinN3 -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testConstructByPOSTinUnknow_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD -FAILED 
test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3 -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testKeepAlive -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3 -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected -FAILED 
test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected -FAILED test/test_virtuoso__v7_20_3230__dbpedia.py::SPARQLWrapperTests::testSelectByPOSTinJSONLD_Unexpected_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinCSV_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinJSON_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinTSV_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinUnknow_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testAskByGETinXML_Conneg -FAILED 
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinCSV_Unexpected_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinJSONLD_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3 -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinN3_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinRDFXML_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinTURTLE_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinUnknow_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testConstructByGETinXML_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinCSV_Unexpected_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinJSONLD_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3 -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinN3_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML -FAILED 
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinRDFXML_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinTURTLE_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinUnknow_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testDescribeByGETinXML_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testKeepAlive -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryBadFormed -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryDuplicatedPrefix -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryManyPrefixes -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_1 -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testQueryWithComma_3 -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinCSV_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSONLD_Unexpected_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinJSON_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV -FAILED 
test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinTSV_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinUnknow_Conneg -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML -FAILED test/test_virtuoso__v8_03_3313__dbpedia.py::SPARQLWrapperTests::testSelectByGETinXML_Conneg -FAILED test/test_wrapper.py::QueryResult_Test::testConvert - AssertionError: ... -= 858 failed, 38 passed, 549 skipped, 80 xfailed, 381 warnings in 760.92s (0:12:40) = -E: pybuild pybuild:389: test: plugin distutils failed with: exit code=1: cd /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build; python3.13 -m pytest test -dh_auto_test: error: pybuild --test --test-pytest -i python{version} -p 3.13 returned exit code 13 -make[1]: Leaving directory '/build/reproducible-path/sparql-wrapper-python-2.0.0' +dh: command-omitted: The call to "debian/rules override_dh_auto_test" was omitted due to "DEB_BUILD_OPTIONS=nocheck" create-stamp debian/debhelper-build-stamp dh_testroot -O--buildsystem=pybuild dh_prep -O--buildsystem=pybuild @@ -169460,23 +607,7 @@ running build_py running install_lib creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages -creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache -creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v -creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache -copying 
/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/v/cache/lastfailed -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/v/cache/nodeids -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache/v/cache -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/CACHEDIR.TAG -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/.gitignore -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/.pytest_cache/README.md -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/.pytest_cache creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper -creating /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/main.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ -copying 
/build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/sparql_dataframe.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/SPARQLExceptions.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/KeyCaseInsensitiveDict.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/Wrapper.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/SmartWrapper.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ -copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/__pycache__/__init__.cpython-313.pyc -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper/__pycache__ copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/py.typed -> 
/build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/KeyCaseInsensitiveDict.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper copying /build/reproducible-path/sparql-wrapper-python-2.0.0/.pybuild/cpython3_3.13_sparqlwrapper/build/SPARQLWrapper/SPARQLExceptions.py -> /build/reproducible-path/sparql-wrapper-python-2.0.0/debian/python3-sparqlwrapper/usr/lib/python3.13/dist-packages/SPARQLWrapper @@ -169536,12 +667,14 @@ dpkg-buildpackage: info: binary-only upload (no source included) dpkg-genchanges: info: not including original source code in upload I: copying local configuration +I: user script /srv/workspace/pbuilder/188717/tmp/hooks/B01_cleanup starting +I: user script /srv/workspace/pbuilder/188717/tmp/hooks/B01_cleanup finished I: unmounting dev/ptmx filesystem I: unmounting dev/pts filesystem I: unmounting dev/shm filesystem I: unmounting proc filesystem I: unmounting sys filesystem I: cleaning the build env -I: removing directory /srv/workspace/pbuilder/4132079 and its subdirectories -I: Current time: Fri Oct 31 10:01:44 -12 2025 -I: pbuilder-time-stamp: 1761948104 +I: removing directory /srv/workspace/pbuilder/188717 and its subdirectories +I: Current time: Fri Dec 4 18:25:26 +14 2026 +I: pbuilder-time-stamp: 1796358326 Compressing the 2nd log... /var/lib/jenkins/userContent/reproducible/debian/logdiffs/forky/arm64/sparql-wrapper-python_2.0.0-2.diff: 98.5% -- replaced with /var/lib/jenkins/userContent/reproducible/debian/logdiffs/forky/arm64/sparql-wrapper-python_2.0.0-2.diff.gz b2/build.log: 80.9% -- replaced with stdout Compressing the 1st log... 
b1/build.log: 98.5% -- replaced with stdout
Fri Oct 31 22:02:29 UTC 2025 I: diffoscope 306 will be used to compare the two builds:
++ date -u +%s
+ DIFFOSCOPE_STAMP=/var/log/reproducible-builds/diffoscope_stamp_sparql-wrapper-python_forky_arm64_1761948149
+ touch /var/log/reproducible-builds/diffoscope_stamp_sparql-wrapper-python_forky_arm64_1761948149
+ RESULT=0
+ systemd-run '--description=diffoscope on sparql-wrapper-python/2.0.0-2 in forky/arm64' --slice=rb-build-diffoscope.slice -u rb-diffoscope-arm64_11-129892 '--property=SuccessExitStatus=1 124' --user --send-sighup --pipe --wait -E TMPDIR timeout 155m nice schroot --directory /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy --run-session -c jenkins-reproducible-forky-diffoscope-fd9c602c-9d21-4ad5-9ffa-b1f30bc8163c -- sh -c 'export TMPDIR=/srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/dbd-tmp-W1TZgZB ; timeout 150m diffoscope --timeout 7200 --html /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/sparql-wrapper-python_2.0.0-2.diffoscope.html --text /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/sparql-wrapper-python_2.0.0-2.diffoscope.txt --json /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/sparql-wrapper-python_2.0.0-2.diffoscope.json --profile=- /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b1/sparql-wrapper-python_2.0.0-2_arm64.changes /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b2/sparql-wrapper-python_2.0.0-2_arm64.changes'
+ false
+ set +x
Running as unit: rb-diffoscope-arm64_11-129892.service; invocation ID: 8927c9daf2244bea864515f299c940f3
# Profiling output for: /usr/bin/diffoscope --timeout 7200 --html /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/sparql-wrapper-python_2.0.0-2.diffoscope.html --text /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/sparql-wrapper-python_2.0.0-2.diffoscope.txt --json /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/sparql-wrapper-python_2.0.0-2.diffoscope.json --profile=- /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b1/sparql-wrapper-python_2.0.0-2_arm64.changes /srv/reproducible-results/rbuild-debian/r-b-build.WPLpZzUy/b2/sparql-wrapper-python_2.0.0-2_arm64.changes
## command (total time: 0.000s)
    0.000s  1 call   cmp (internal)
## has_same_content_as (total time: 0.000s)
    0.000s  1 call   diffoscope.comparators.binary.FilesystemFile
## main (total time: 0.003s)
    0.003s  2 calls  outputs
    0.000s  1 call   cleanup
Finished with result: success
Main processes terminated with: code=exited, status=0/SUCCESS
Service runtime: 230ms
CPU time consumed: 172ms
Memory peak: 17.6M (swap: 0B)

[ASCII-art banner: "sparql-wrapper-python"]

Fri Oct 31 22:02:29 UTC 2025 I: diffoscope 306 found no differences in the changes files, and a .buildinfo file also exists.
Fri Oct 31 22:02:29 UTC 2025 I: sparql-wrapper-python from forky built successfully and reproducibly on arm64.
INSERT 0 1
INSERT 0 1
DELETE 1
[2025-10-31 22:02:29] INFO: Starting at 2025-10-31 22:02:29.889102
[2025-10-31 22:02:30] INFO: Generating the pages of 1 package(s)
[2025-10-31 22:02:30] CRITICAL: https://tests.reproducible-builds.org/debian/forky/arm64/sparql-wrapper-python didn't produce a buildlog, even though it has been built.
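The `--property=SuccessExitStatus=1 124` passed to systemd-run above reflects diffoscope's exit-code convention: 0 means no differences were found (the outcome recorded here), 1 means differences were found (not a failure of the comparison itself), and 124 is GNU `timeout`'s status when it kills a long run. A hypothetical helper (`verdict` is an illustrative name, not part of the framework) mapping those codes to verdicts:

```python
# Hypothetical mapping of diffoscope/timeout exit codes to build verdicts;
# the code values come from diffoscope and GNU coreutils `timeout`.
def verdict(exit_code: int) -> str:
    if exit_code == 0:
        return "reproducible"    # diffoscope: no differences found
    if exit_code == 1:
        return "unreproducible"  # diffoscope: differences found
    if exit_code == 124:
        return "timeout"         # GNU timeout killed the comparison
    return "error"               # anything else is a real failure

print(verdict(0))  # this build's outcome: reproducible
```

Treating 1 and 124 as "success" at the systemd level keeps the service from being flagged as failed when the package is merely unreproducible or slow, while genuine diffoscope errors (exit code 2 and up) still fail the unit.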
[2025-10-31 22:02:30] INFO: Finished at 2025-10-31 22:02:30.168083, took: 0:00:00.278984
Fri Oct 31 22:02:30 UTC 2025 - successfully updated the database and updated https://tests.reproducible-builds.org/debian/rb-pkg/forky/arm64/sparql-wrapper-python.html
Fri Oct 31 22:02:30 UTC 2025 I: Removing signed sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc files:
removed './b1/sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc'
removed './b2/sparql-wrapper-python_2.0.0-2_arm64.buildinfo.asc'
1761948150 arm64 forky sparql-wrapper-python
Starting cleanup.
/var/lib/jenkins/userContent/reproducible/debian/rbuild/forky/arm64/sparql-wrapper-python_2.0.0-2.rbuild.log: 98.4% -- replaced with /var/lib/jenkins/userContent/reproducible/debian/rbuild/forky/arm64/sparql-wrapper-python_2.0.0-2.rbuild.log.gz
[2025-10-31 22:02:30] INFO: Starting at 2025-10-31 22:02:30.565169
[2025-10-31 22:02:30] INFO: Generating the pages of 1 package(s)
[2025-10-31 22:02:30] INFO: Finished at 2025-10-31 22:02:30.845031, took: 0:00:00.279865
All cleanup done.
Fri Oct 31 22:02:30 UTC 2025 - total duration: 0h 14m 9s.
Fri Oct 31 22:02:30 UTC 2025 - reproducible_build.sh stopped running as /tmp/jenkins-script-6jTjdR93, removing.
Finished with result: success
Main processes terminated with: code=exited, status=0/SUCCESS
Service runtime: 14min 10.341s
CPU time consumed: 3.942s
Memory peak: 78.3M (swap: 0B)